GenCheckinNotesUpdateWorkitem task is expensive!!!


 


The GenCheckinNotesUpdateWorkitem task is the most expensive and takes most of the build time. The problem is at its worst when you launch a build for the first time for a new build type. Why? And what can be done about it?


Some statistics related to builds in Team Build (for a simple HelloWorld.csproj project):



  • 1st successful build (with the UpdateWorkitem flag set to false)

    • 1 file with 5000 changesets (GCNUW task takes 98.93 % of build time)
    • 5000 files in 4 changesets, with each changeset containing 1300 files (GCNUW task takes 72.18 % of build time)
    • 5000 files in 357 changesets, with each changeset containing 100 files (GCNUW task takes 90.65 % of build time)
    • 5000 files in 5000 changesets, with each changeset containing 2 to 5 files (GCNUW task takes 90.83 % of build time)

  • 2nd successful build (with the UpdateWorkitem flag set to false)

    • 1 file with 5000 changesets (GCNUW task takes 2.42 % of build time)
    • 5000 files in 4 changesets, with each changeset containing 1300 files (GCNUW task takes 3.25 % of build time)
    • 5000 files in 357 changesets, with each changeset containing 100 files (GCNUW task takes 3.8 % of build time)
    • 5000 files in 5000 changesets, with each changeset containing 2 to 5 files (GCNUW task takes 2.15 % of build time)

Disclaimer: Please note that I was using low-end build machines (my dev box with 512 MB RAM). Moreover, the data above is for a very small/dummy HelloWorld project, which has negligible compile time. We also had the alternative rolling build setup. When we built all VSTF sources for the first time using Team Build, it took 1 hr 17 min 28 sec 57 ms. The build churned out 1449 changesets.


Learning



  • CPU consumption on the AT went up to a maximum of 38% (w3wp.exe) while the GCNUW task was executing on the build machine.
  • The GCNUW task takes the bulk of the build time in the 1st build, when all the items are evaluated.
  • The Get task takes the bulk of the build time in the 2nd build, when the number of changesets is very small (or zero).
  • The number of files associated with each changeset does not impact the build time.
  • The number of changesets has a direct impact on the task times.
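The last two learnings can be illustrated with a toy cost model. This is only a sketch, under the assumption (consistent with the numbers above, but not taken from the product code) that the task's time is dominated by per-changeset work and that duplicate changeset entries are discarded; the function and cost constant are hypothetical:

```python
# Hypothetical cost model: GCNUW time tracks the number of unique
# changesets between builds, not the number of files per changeset.

COST_PER_CHANGESET = 1.0  # arbitrary time units (assumed constant)

def gcnuw_cost(changeset_ids_per_file):
    """changeset_ids_per_file: iterable of changeset-ID lists, one per file."""
    unique = set()
    for ids in changeset_ids_per_file:
        unique.update(ids)              # duplicate changesets are ignored
    return len(unique) * COST_PER_CHANGESET

# 5000 files all committed in 4 big changesets -> cheap:
few_big = [[i % 4] for i in range(5000)]
# 5000 files, each in its own changeset -> expensive:
many_small = [[i] for i in range(5000)]

print(gcnuw_cost(few_big))     # 4.0
print(gcnuw_cost(many_small))  # 5000.0
```

Under this model, the 4-changeset and 5000-changeset scenarios above differ by three orders of magnitude in task cost even though they touch a similar number of files.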

Machine Details 



  • Client/BM on the same machine, configured at 2.79 GHz with 512 MB RAM
  • AT configured at 3 GHz with 1 GB RAM

  


Comments (5)

  1. Peter Brouwer says:

    Hi Manish,

    Thanks for checking up on this. As I wrote to Khushboo, I find that the ‘Getting changesets’ activity is taking a lot of time because it seems to iterate over all files that are in the changesets individually. You state that: "The problem is worst when you launched the build for the first time for a new build type." I don’t think that is completely true. I think that it depends on the number of changesets/files between 2 successful builds (all the changesets for the build). Of course the first build of a build type will pick up all historic changes and therefore takes a lot of time.

    I’m not sure if I exactly understand what you meant by:

    – Number of files associated with each change set does not impact the build time

    – Number of changesets has the direct impact on the task times

    My practical experience is that the total number of files in all the changesets impacts the build time. For example:

    – Getting changesets for 16 changesets with a total of 4k files (almost all in one changeset) took about 26 minutes.

    – Getting changesets for 5 changesets with a total of 320 files (4 changesets of 5 files and 1 changeset of 300 files) took about 13 minutes.

    Hope this helps,

    Peter

  2. ManishAgarwal says:

    Peter,

    1) You are right. Task time depends on the number of files affected and the number of changesets between the two builds. This is implied by my observation "Number of changesets has a direct impact on task execution time".

    2) You are right in your example that the number of files affected between builds impacts the task execution time, i.e. QueryHistory on 4000 files took more time than on 320 files.

    3) "Number of files associated with changeset does not impact build time" implies that the size of a changeset does not matter from the task execution cost (time) point of view – whether one changeset contains 100 files or only 1 file. This is because all the files will be queried irrespective of the number of files associated with each changeset; only unique changesets are stored, and duplicate entries are ignored.
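    The de-duplication described in point 3 can be sketched as follows. The function name and data shapes are hypothetical, but the logic matches the description: every file's history is walked, yet each changeset is stored only once:

    ```python
    def collect_unique_changesets(history_by_file):
        """history_by_file: dict of file path -> changeset IDs that touched it."""
        unique = set()
        for path, changeset_ids in history_by_file.items():
            for cs in changeset_ids:   # every file's history is queried...
                unique.add(cs)         # ...but duplicate changesets are ignored
        return sorted(unique)

    # One changeset (42) touching 100 files still yields a single entry:
    history = {f"src/file{i}.cs": [42] for i in range(100)}
    print(collect_unique_changesets(history))  # [42]
    ```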

    4) Please note that I have not considered the cost of querying associated work items. The result might be different…

  3. Peter Brouwer says:

    Hi Manish,

    3) What do you mean by "all the files will be queried irrespective…"? I don’t understand what you are describing here.

    4) We currently don’t use work items, so I cannot comment on that.

    To me it looks like MSBuild is trying to duplicate the changeset/file history information from TFS Source Control into the build summary. Otherwise I cannot explain why it takes such a long time. When I view the history in my TFS Source Control, it only takes a couple of seconds to display all the changesets. When I then view the detail of a large changeset (4k files), it takes about 10 seconds. To me this seems to be enough information to put into the build summary, but maybe you need more information?

    I’m sure you understand how serious this problem is: what if you are working with TFS Source Control for a couple of months and then decide to add a new build type? In those months tens of thousands of files/changesets could be there; the build could take hours, and if the build fails the next build will again take hours.

    Regards,

    Peter

  4. Manish Agarwal says:

    Peter, we are working on this issue. Thanks for your feedback.

    Manish

  5. Buck Hodges says:

    Carl Daniel (code) and Robert Downey (code) each wrote and posted code to show the changesets between…
