Another ASP question…


I've been working on my website a fair bit. Developing locally works well for most things, but the Google Maps stuff only works if it's running on my real domain, so I have to get it up to the website.


I used "Publish" to do that, which turned out to be a bit of a mistake. When you do this, VS says it will delete files and asks for confirmation. I assumed this was just about overwriting the files that were already up there, but it's really a "scorched earth" approach that toasted every file on my website before deploying the application.


A day later and a few $$$ lighter, I had my content back from backup.


So, what should I be doing? Publish is convenient in that it gets everything up there, but it takes roughly forever to do so, so it's not the solution I'd prefer. I've looked at "Copy Website", which might work, but I presume I would have to a) figure out what I've changed and b) copy each file up to the server in the right place. Getting this right (i.e. all the files and any assemblies I use) is pretty tough, and I don't want a site that's sometimes published and sometimes web-copied…


Is there a better way of doing what I want to do? Or is the presumption that I will do all my development locally?

Comments (12)

  1. Kris Huggins says:

    It may be tough in a Windows environment (you may need to use Cygwin), but rsync is a very good tool.
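
    For example, a one-way sync from a Cygwin shell might look something like this (the user, host, and paths are placeholders, and it assumes the server accepts SSH):

        rsync -avz --exclude "*.cs" --exclude "*.csproj" /cygdrive/c/dev/MySite/ user@example.com:/wwwroot/MySite/

    rsync only transfers files that have changed, which takes care of the "figure out what I've changed" problem for you.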

  2. ben says:

    If you have your stuff in version control it's easy, because deploying is as simple as getting the latest stuff from your repository; just check in the assemblies and everything. That way, you don't need to keep track of what's changed.
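
    For instance, with Subversion (the file name is hypothetical):

        svn add bin/MySite.dll
        svn commit -m "New build plus content changes"

    Then a single "svn update" in the web root on the server pulls down exactly what changed.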

  3. Phil says:

    I usually just zip up everything and xcopy the files over the ones that currently exist on the real website. Only the files that are included in your ASP.NET Web project will be copied. It looks like the copy command will let you specify that you only want to copy files that are in the project, and it will leave the non-project files on your website alone.
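
    Something like this should do it (the paths are placeholders; /D copies only files newer than the copy on the server, /E includes subdirectories, /Y suppresses overwrite prompts):

        xcopy C:\dev\MySite \\webserver\wwwroot\MySite /D /E /Y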

  4. The publish option in VS2003 sucks. It's still bad in 2005.

    You could try UnleashIt instead, http://www.eworldui.net/UnleashIt/

    It "sort of" integrates with Visual Studio, but it does a backup first, and it replaces files rather than deleting everything and pushing it all again. It's not perfect, but it's better than VS.

  5. mvark says:

    I noticed that while Google Maps works only on a real domain, Virtual Earth – http://virtualearth.msn.com/ – works locally and does not require signing up for an API key. The developer resources are at http://www.viavirtualearth.com/vve/Dashboard/Default.ashx

  6. Geoff Taylor says:

    True, the publish options in VS sucketh mightily. The easiest option for what you want (without installing additional software) is indeed Copy Website. Highlight _all_ files and folders in your local hierarchy and push the two-arrowed "Synchronize" button. It will (eventually) publish all the changed files but not the unchanged ones.

  7. Haacked says:

    While developing locally, try editing your "Hosts" file so that your "Real" domain points to 127.0.0.1.

    Maybe you can trick it into working while you dev.
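
    On Windows the file is at C:\Windows\System32\drivers\etc\hosts, and the entry would look something like this (the domain is a placeholder):

        127.0.0.1    www.example.com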

    If not, UnleashIt works well despite its unfortunate name.

  8. Rob says:

    I've been using the VS 2005 Copy Web feature, which shows which files have changed locally vs. on the server. It works great when you only want to replace a couple of files rather than the whole site, which is what 2003 required.

    I'm using the same hosting provider as you, and Copy Web connects via FTP.

  9. RichB says:

    Use Subversion and "svn update" when you want to sync to a specific version of your code.
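
    For example (the revision number is hypothetical):

        svn update -r 1234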

  10. Rick Strahl says:

    I've fought with this for a while as well. Basically, the upload options in VS.NET 2005 are pretty lame. The Copy Web dialog is useless unless you deploy your source files; if you don't, the ASP.NET pre-compiler must be used, and it changes the ASPX files (it embeds a signature). So that's out unless you want to deploy source. The Deploy Web feature works, but only if your deployed site can go up as-is (i.e. no changes to Web.config or other configuration), you have no unneeded files (project or solution files, config tools, etc.), and you don't mind copying EVERYTHING again, even images and other static content that likely hasn't changed. A lot of ifs in that…

    What I've taken to is using the ASP.NET Compiler to generate to a deployment directory and then copying just the BIN directory content (after an initial copy-everything install); a rough sketch of the commands is below. This keeps the files copied up to a minimum, but it's still a PITA. It's beyond me that the ASP.NET team did not bother to think deployment out more completely and provide an option that is more easily updateable… more here:

    http://west-wind.com/weblog/posts/2454.aspx
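
    In practice that workflow looks something like this (the virtual path, directories, and server share are placeholders; -u keeps the precompiled output updateable):

        aspnet_compiler -v /MySite -p C:\dev\MySite -u C:\deploy\MySite
        xcopy C:\deploy\MySite\bin \\webserver\wwwroot\MySite\bin /D /Y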

  11. Chris Lundie says:

    Adding a 127.0.0.1 hosts entry does work with the Google Maps API. Just remembering to change it back is the tricky part.

  12. RichB says:

    I just want to reiterate that your /etc/hosts file is the way to do GMaps development.
