
Featured content

Go to http://elfiberhabla.net to follow my blog! This site is no longer active!

Extreme programming methodology: theory and experience

Theory

Extreme programming (XP) is an agile development methodology based on programming "inside out". One of its main purposes is to provide flexibility against changing requirements, allow rapid releases, and avoid heavy processes not directly related to programming. It follows the Agile Manifesto:


  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

And twelve principles:

  1. Customer satisfaction through early and continuous delivery of valuable software
  2. Welcome changing requirements, even late in development
  3. Deliver working software frequently (2 weeks - 2 months)
  4. Business people and developers must work together daily
  5. Build projects around motivated individuals
  6. Face-to-face conversation
  7. Working software is the primary measure of progress
  8. Agile processes promote sustainable development
  9. Continuous attention to technical excellence and good design
  10. Simplicity
  11. Self-organizing teams
  12. Regular adaptation to changing circumstances

These principles are shared by all agile development methodologies. Speaking more strictly about XP, we can point out four main aspects of this methodology:

  • There are four variables to be controlled: Cost, Time, Quality and Scope.
  • Five values to be promoted: Communication, Simplicity, Feedback, Courage and Respect.
  • Five principles that should guide us: Rapid feedback, Assuming simplicity, Incremental changes, Embracing change, Quality work
  • The twelve practices: Planning game, small releases, simple design, automated testing, continuous integration, refactoring, pair programming, collective ownership, 40-hour week, on-site customer, coding standard, metaphor

Nevertheless, the previous statements are not absolute; they depend on the team and don't need to be followed strictly (one of the advantages of XP over heavyweight programming processes is precisely this lack of full strictness).

These diagrams should be self-explanatory:

As said before, XP needs to satisfy the customer through early and continuous deliveries. There is a release planning phase at first. Once everything is planned, the programmers do their work, iterating as many times as required. Once the loop has finished, we have to consider whether something must be changed (and replan), or whether everything is OK and can be published. Once a version is published, the project must keep improving; otherwise the project has ended.

In every iteration (an iteration being a programming cycle plus its acceptance) a small piece of teamwork is defined (not strictly necessary, but highly recommended), and everything is programmed and tested. If the code is not accepted or does not satisfy the requirements, another iteration is made. Otherwise, the piece of code is integrated with the rest of the team's work. If the integration is successful, the program can be published.

This is the main process in the XP methodology. Practice may be a bit different, but not too much.

As a summary, these two tables explain the differences between an agile methodology and a heavyweight methodology like RUP (where documentation, such as UML diagrams and other documents produced by analysts and designers, is very important because the teams tend to be larger):

Although they are very different processes, an agile process could become a heavyweight development process over time, but covering that is not the aim of this post.

Practice

Practice is slightly different. In the nwiki project, the process is:

  1. First, there is a customer with some requirements that should be satisfied (Moodle 2.0).
  2. Define the tasks needed to satisfy those requirements.
  3. Hold continuous meetings with the other teammates and explain to them the objectives, purposes, workflow... everything about the project that needs to be developed.
  4. Once all developers have understood the problem and have a minimum idea of the development process (how the project will be done), split the tasks among them.
  5. Every so often, meet again to review the state of the work, changes in requirements, problems to discuss, anything else...

At first, only an ewiki and old nwiki migration was to be done. However, after the Moodlemoot event, an ouwiki migration must be done too, and the ewiki parser will no longer be supported. These changes will be made at a lower cost than if a heavyweight process had been used (a demonstration of the advantages of XP).

To sum up, a heavyweight programming methodology can be used in clearly defined projects (with extensive analysis and specification) and in projects that involve a large number of developers, where software quality becomes important. For smaller teams and projects without much time, extreme programming could be the methodology to follow.


Part of the migration finished - Documentation

The logical scheme of the nwiki project has changed a bit. The conceptual models of ewiki, nwiki 1.9 and the new nwiki 2.0 are shown below:

In both cases (upgrading from nwiki 1.9 or from ewiki) the old tables must be kept (renamed as 'old_tablename'). In previous versions of nwiki, the old tables were simply transformed into the new ones. In this new version, however, we can't do that.

A migration should consider the whole community that will upgrade to the new version of Moodle (and with it, to the new version of nwiki). This ranges from single users to university institutions. The latter probably have thousands of entries to migrate in their databases (pages, wikis...), and this can be a problem: not only could migrating the old wikis take too much time; the server memory might not be enough for such an operation, depending on the number of pages.

The solution to this problem is importing wikis and pages dynamically.

Focusing on the conceptual model of the new nwiki:

  • At the very least, the wiki table must be migrated during the migration process (all data in this table can be migrated without problems, and this must be done). Further explanations of this migration process can be found below.
  • There is a "wiki_instances" table. It contains the wiki identifier for each instance and which owner/users/groups are allowed to use that wiki. Filling this table at migration time would be a problem (lots of users, lots of wikis...), so at first it will be empty. The table will be filled dynamically in the following way:
    • Every time a user accesses a wiki for the first time (after a migration), a wiki_instances tuple is added to the database.
  • "Wiki_pages" is another problem: thousands and thousands of pages. So this table won't be migrated immediately during the process either. Every time any user accesses a page for the first time since the wiki was migrated, that page is migrated (and its synonyms table entries, if necessary). That is why the old wiki tables need to be kept: wiki pages are migrated from the old wiki_pages tables, which will probably empty out progressively; when an old wiki_pages table is empty, it will be removed.

This is the main point of the migration process.
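
As an illustration of this lazy, on-access approach, here is a minimal sketch. It relies on the Moodle 1.9 DML call get_record(); the field names and the helper wiki_migrate_old_page() are hypothetical, not the project's real names:

function wiki_get_page($wikiid, $title) {
    // If the page already lives in the new wiki_pages table, nothing needs migrating.
    if ($page = get_record('wiki_pages', 'wikiid', $wikiid, 'title', $title)) {
        return $page;
    }
    // Otherwise, look it up in the renamed old table and convert it on the fly.
    if ($oldpage = get_record('old_wiki_pages', 'wikiid', $wikiid, 'title', $title)) {
        return wiki_migrate_old_page($oldpage); // hypothetical helper: insert the converted tuple, remove the old one
    }
    return false; // the page does not exist at all
}

The same idea applies to wiki_instances: the tuple is only inserted the first time a user enters the wiki after the upgrade.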

Now let's see the Wiki table migration:

You can find the code in mod/wiki/db/upgrade.php in the project.

This file can only upgrade from ewiki or from nwiki 1.9. Otherwise, the upgrade will not be completed; you will be notified of the problem and advised to upgrade to nwiki 1.9 first or to reinstall ewiki.

In both cases, the old wiki tables are renamed and the new tables installed. Once this is done, the migration of the old tables begins. In upgrade.php, only the wiki table is migrated (a full migration could take too long depending on the number of pages and wikis, for the reasons explained above).
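
A rough sketch of that renaming step, assuming the Moodle 1.9 DDL helper rename_table() (the concrete table is just an example):

$table = new XMLDBTable('wiki_pages');
rename_table($table, 'old_wiki_pages'); // keep the old data around for the lazy migration
// ...repeat for the other old wiki tables, then install the new tables from db/install.xml...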

The migration of the wiki table when upgrading from nwiki 1.9 to nwiki 2.0 is quite simple: most fields match in type and concept, so the mapping between fields is direct.

The migration when upgrading from ewiki to nwiki 2.0 is more complex. Some fields map directly, but others don't. The 'wtype' field of the ewiki wiki table says whether students must work in groups (group), on their own (student), or cannot edit the content at all (teacher). This field is closely related to the studentmode field of the new nwiki 2.0. Depending on the value of groupmode in the course_modules table, the studentmode value must be 0, 1 or 2:

  • 0 = students in group, users can work together.
  • 1 = separate students, students work on their own and no one except the teacher can see each other's work.
  • 2 = students visible, students work on their own, but they can see each other.

When wtype is group or teacher, the students must work together (the groupmode attribute from course_modules will separate them if needed, not this one), so the value must be 0. If wtype is student, studentmode takes the value 1 or 2 depending on the groupmode attribute: if groupmode = 0 (no groups) or 1, then studentmode = 1; otherwise studentmode = 2.
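
A small sketch of that mapping, with illustrative variable names ($oldwiki holds the old ewiki record and $groupmode the value taken from course_modules):

switch ($oldwiki->wtype) {
    case 'group':
    case 'teacher':
        $studentmode = 0; // students work together; course_modules' groupmode separates them if needed
        break;
    case 'student':
        // groupmode 0 (no groups) or 1 -> separate students; otherwise students are visible to each other
        $studentmode = ($groupmode == 0 || $groupmode == 1) ? 1 : 2;
        break;
}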

Another attribute is htmlmode, which describes whether the wiki was edited via HTML code or via the ewiki parser. It is rather simple to map.

Finally, in both cases (ewiki and nwiki), wiki tuples are retrieved in batches of 100 to avoid running out of memory on servers with lots of wikis.
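
A minimal sketch of that batched retrieval, using the limitfrom/limitnum parameters of the Moodle 1.9 get_records() call (the old table name is illustrative):

$batchsize = 100;
$offset = 0;
while ($wikis = get_records('old_wiki', '', '', 'id', '*', $offset, $batchsize)) {
    foreach ($wikis as $oldwiki) {
        // ...map the old fields onto the new wiki table here...
    }
    $offset += $batchsize; // get_records() returns false when nothing is left, ending the loop
}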


Candidature for HDC

Well, HDC is an easy-to-pass subject. However, it requires some effort and a spirit of self-improvement to achieve its main objective: speaking in public. For that, you need to lose your shyness. Being a good leader isn't easy either, and sometimes innate abilities seem to be required. If I get chosen as delegate, I will ensure:

  1. Listen to all your complaints/suggestions and deliver them to the appropriate governing body.
  2. Good and accurate follow-up of the subject.
  3. If you're in trouble with the subject and need my assistance, I'll do my best to help you. My smile is yours.
  4. If you need someone to speak with in order to practice your own speaking skills, and I have the chance at the right moment, I will gladly listen to you.

Choose me next Monday. I can assure you that you will not regret your decision :)


Bibliography

  • Moodle Official Webpage: http://www.moodle.org/
  • Moodle 1.9 API: http://xref.moodle.org/
  • Meaning of Abstraction: http://en.wikipedia.org/wiki/Abstraction_%28computer_science%29
  • Moodlemoot homepage: http://www.moodlemoot.org/
  • Sakai Project: http://sakaiproject.org/portal
  • Google Web Toolkit homepage: http://code.google.com/webtoolkit/
  • Mahara Webpage: http://www.mahara.org/
  • Tutorials on XML, XSLT...: http://www.w3schools.com
  • PHP Official Homepage: http://www.php.net

My task: The Migration

Well, this afternoon I met the rest of the nwiki team to discuss how to divide the work, which programming guidelines we should follow, and how the whole job will be done.

First, we discussed which tools we will use. These tools are:

  1. Git: Git is a version control system, developed by Linus Torvalds. Its advantage over SVN is that it maintains two repositories: one on the local machine and one on a remote server, so you make changes locally and, when you are satisfied with them, push everything to the remote server. Day-to-day use is quite similar to SVN.
  2. Trac: Trac is an issue tracker. For now, we will use it to organize ourselves, dividing the work (by opening tickets) and reporting bugs.

Once the tools are established, we can begin to work. First, we should follow the Moodle Coding Guidelines plus some rules of our own:

  1. All functions we write must begin with wiki_
  2. PHPDoc for every function (each function must be well documented: description, parameters, return value...)
  3. For every function we write, we should create a PHPUnit-style test and run it. These tests, in simple words, are sets of test cases thrown against the function you wrote to check whether it works correctly. They all share a similar syntax. Moodle created its own mechanism for running these tests instead of using third-party software.
  4. Do not use the $_POST[] and $_GET[] variables to read values arriving from an external source. Instead, use the optional_param() function to retrieve the value, whether it comes via GET or POST (see the sketch after this list).
  5. We have to keep in mind that any HTML we produce must be XHTML 1.0 Strict.
  6. Any CSS we need to add must go into the styles.php file. Moodle fetches this file for additional CSS rules if it is present.
  7. Every permission that needs to be assigned must be defined in the mod/wiki/db/access.php file.
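
A short, purely illustrative sketch of rules 1, 2 and 4 (wiki_get_instance() is a made-up name, not one of the project's real functions):

/**
 * Returns the wiki record for a given instance id.
 *
 * @param int $wikiid id of the wiki instance
 * @return mixed the wiki record object, or false if it does not exist
 */
function wiki_get_instance($wikiid) {
    return get_record('wiki', 'id', $wikiid);
}

// Rule 4: read request parameters through optional_param() instead of $_GET/$_POST.
$pageid = optional_param('pageid', 0, PARAM_INT);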

After that, we discussed the database structure. This is the overall structure:

We must also document the application architecture. We also went over some important files to consider:

  • lib.php -> functions that Moodle calls automatically.
  • locallib.php -> functions for internal use.
  • version.php -> for allowing Moodle to notice a DB upgrade (see the small example after this list)
  • weblib.php -> to print screens
  • language files
  • export folder -> for exporting methods
  • db folder -> install.xml, upgrade.php...
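
As a reminder, this is roughly what version.php looks like in a Moodle 1.x module (the numbers are illustrative, not the project's real values):

$module->version  = 2008100800; // bumping this number makes Moodle run db/upgrade.php
$module->requires = 2007101509; // minimum Moodle version required (illustrative)
$module->cron     = 0;          // this module does not need cron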

After all this, I've been assigned the database migration from ewiki and older nwiki versions to the new wiki (probably I'll have to program the upgrade.php file). Pigui told me it could take me two months (omg!). Let's see how hard my task really is... Time will tell...


Training: repair an nwiki bug

Well, as training before entering the nwiki refactoring project, my project leader asked me to repair an nwiki bug: http://moodle.org/mod/forum/discuss.php?d=107332

This bug is located in db/upgrade.php (inside the nwiki module directory). The bug is easy to understand: from the previous version of nwiki to the new one, the "wiki" table changed a bit. One of these changes was modifying the type of one field, the evaluation field, from string to integer. The previous developer programmed the type change directly with a DDL function (change_field_type()). However, there is a problem: if the table or the field is empty, everything works; but if the evaluation field was previously filled with a string value, the conversion from string to integer cannot be done, and the upgrade fails.

My task is to repair this undesired behavior. The way to do it was:

  1. Create a new auxiliary field alongside the evaluation field in the wiki table, in order to store the integer value corresponding to each string.
  2. Once this auxiliary field is filled for all the table's tuples, drop the evaluation field and recreate it with type integer. Once this is done, copy all the auxiliary field data into the new evaluation field and drop the auxiliary field.

With the Moodle DML and DDL libraries (libraries used to query and manipulate the Moodle database) this shouldn't be too difficult, and actually it isn't. However, I needed to run an UPDATE SQL statement with a WHERE clause, and I haven't found any Moodle function for that, so I did it with a function called execute_sql(), which takes an SQL query as a parameter and executes it against the database.

What's the catch? Not all DBMSs share the same SQL syntax. The functions provided by the Moodle DML library solve this problem, but as I said before, I haven't found any update function that accepts a WHERE clause.

I hope Pigui will give me some assistance. But since all my work will be thrown away with the nwiki refactoring, Pigui will not spend too much time on this. As for me, I've seen that I'm capable of doing it and of repairing bugs through work. I think I'm ready for whatever task may be assigned to me.

This afternoon all the scholarship holders will have a meeting to discuss the methodology we should follow for developing nwiki to fit the new Moodle 2.0 architecture (probably available in early 2009). Afterwards, I'm going to write a summary of the main points of that meeting.

Here is the code I wrote to repair the bug (it is incomplete and fails, but the main idea is clearly visible):

$table = new XMLDBTable('wiki');

// Definition of the evaluation field as it must end up: an integer field.
$field = new XMLDBField('evaluation');
$field->setAttributes(XMLDB_TYPE_INTEGER, '3', XMLDB_UNSIGNED, XMLDB_NOTNULL, null, null, null, '0', 'studentdiscussion');

// Auxiliary integer field used to hold the converted values temporarily.
$auxfield = new XMLDBField('evaluation2');
$auxfield->setAttributes(XMLDB_TYPE_INTEGER, '3', XMLDB_UNSIGNED, XMLDB_NOTNULL, null, null, null, '0', 'studentdiscussion');
add_field($table, $auxfield);

/* Substitution: translate every old string value into its integer equivalent. */
if ($evaluation_values = get_records('wiki', '', '', '', 'id,evaluation')) {
    foreach ($evaluation_values as $single_value) {
        // TODO: the actual legacy string values are still missing (note for Pigui).
        if ($single_value->evaluation == '' /* TODO: first legacy value */) {
            $a = 0;
        } else if ($single_value->evaluation == '' /* TODO: second legacy value */) {
            $a = 1;
        } else {
            $a = 2;
        }
        $quer = 'UPDATE '.$CFG->prefix.'wiki SET evaluation2 = '.$a.' WHERE id = '.$single_value->id;
        execute_sql($quer);
    }
}

// Now drop the old evaluation field and recreate it as an integer field.
drop_field($table, $field);
add_field($table, $field);

// Copy all evaluation2 values into the new evaluation field.
$quer = 'UPDATE '.$CFG->prefix.'wiki SET evaluation = evaluation2';
execute_sql($quer);

// Drop the auxiliary field.
drop_field($table, $auxfield);