A new scripting language doesn’t solve everything


Yes, there are plenty of scripting languages that are much better than boring old batch. Batch files were definitely a huge improvement over SUBMIT back in 1981, but they've been showing their age for quite some time. The advanced age of boring old batch, on the other hand, means that you have millions of batch files out there that you had better not break if you know what's good for you. (Sure, in retrospect, you might decide to call the batch language a design mistake, but remember that it had to run in 64KB of memory on a 4.77MHz machine while still remaining compatible in spirit with CP/M.)

Shipping a new command shell doesn't solve everything either. For one thing, you have to decide if you are going to support classic batch files or not. Maybe you decide that you won't and prefer to force people to rewrite all their batch files into your new language. Good luck on that.

On the other hand, if you decide that you will support batch files after all, then presumably your new command shell will not execute old batch files natively, but rather will defer to CMD.EXE. And there's your problem: You see, batch files have the ability to modify environment variables and have the changes persist beyond the end of the batch file. Try it:

C> copy con marco.cmd
@set MARCO=polo
^Z
        1 file(s) copied.

C> echo %MARCO%
%MARCO%

C> marco

C> echo %MARCO%
polo

If your new command shell defers to CMD.EXE, these environment changes won't propagate back to your command shell since the batch file modifies the environment variables of CMD.EXE, not your shell. Many organizations have a system of batch files that rely on the ability to pass parameters between scripts by stashing them into environment variables. The DDK's own razzle does this, for example, in order to establish a consistent build environment and pass information to build.exe about what kind of build you're making. And I bet you have a batch file or two that sets your PROMPT or PATH environment variable or changes your current directory.
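The underlying rule is that a process can modify only its own environment; batch files appear to break the rule only because CMD.EXE interprets them in-process. Here is a small sketch of the one-way inheritance (Python is used purely as a portable illustration; the variable name MARCO is just the example above):

```python
import os
import subprocess
import sys

os.environ.pop("MARCO", None)  # start from a clean slate

# A child process gets a *copy* of the parent's environment;
# anything the child changes dies with it.
subprocess.run(
    [sys.executable, "-c", "import os; os.environ['MARCO'] = 'polo'"],
    check=True,
)

# Back in the parent, the variable was never set.
print(os.environ.get("MARCO"))  # prints: None
```

A batch file run by the same CMD.EXE avoids this because no child process is involved — which is exactly what a foreign shell that defers to CMD.EXE gives up.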

So good luck with your replacement command shell. I hope you figure out how to run batch files.

Comments (96)
  1. Anonymous coward says:

    http://www.microsoft.com/technet/scriptcenter/hubs/msh.mspx

    Looks like a nice idea :). More power and easier.

    Shouldn’t this live together with the old command shell?

  2. josh says:

    Personally, I’d make people invoke cmd explicitly if they need batch files.  Presumably it’s easier to pass values out of the new shell than out of cmd, so shimming them around for a hybrid script would be possible.

  3. Carlos says:

    That’s easy!  Suppose you want to run "real.bat".  Write a temporary batch file containing:

    call "real.bat"

    set > env.tmp

    and execute it using cmd.exe.  When cmd.exe terminates extract the environment variables from "env.tmp".  You can hide this cruft from the end-user.
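    Carlos's trick can be sketched generically: run the script in a child process, have the child dump its final environment to a file, then import that file back into the parent. In this sketch Python stands in for the new shell, and a tiny Python child stands in for cmd.exe running "real.bat"; on Windows the child command would really be something like cmd.exe /c "call real.bat & set > env.tmp". The helper name and the ENV_DUMP variable are inventions for the illustration.

```python
import os
import subprocess
import sys
import tempfile

def run_and_capture_env(command):
    """Run a command in a child process, then import any environment
    variables it set back into this process (the env.tmp trick).

    The command must, as its last act, dump its environment as
    NAME=value lines into the file named by the ENV_DUMP variable
    that we pass it.
    """
    with tempfile.NamedTemporaryFile("r", delete=False) as dump:
        dump_path = dump.name
    env = dict(os.environ, ENV_DUMP=dump_path)
    subprocess.run(command, env=env, check=True)
    with open(dump_path) as f:
        for line in f:
            name, _, value = line.rstrip("\n").partition("=")
            if name:
                os.environ[name] = value
    os.remove(dump_path)

# Stand-in for `cmd /c "call real.bat & set > env.tmp"`: a child
# that sets a variable and dumps its environment before exiting.
child = (
    "import os; "
    "os.environ['MARCO'] = 'polo'; "
    "open(os.environ['ENV_DUMP'], 'w').write("
    "'\\n'.join(f'{k}={v}' for k, v in os.environ.items()))"
)
run_and_capture_env([sys.executable, "-c", child])
print(os.environ["MARCO"])  # prints: polo
```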

  4. Tim Misiak says:

    Monad is a fantastic scripting language and command shell, and will definitely offer a huge boost in productivity to smart system administrators. However, this doesn’t really solve certain problems, as Raymond has stated before:

    http://blogs.msdn.com/oldnewthing/archive/2006/03/22/558007.aspx

    I do believe that it would be a great benefit if Monad shipped standard with some future version of windows, such as Vista SP2 or something. Release with Vista is probably not an option, but that would really be nice. It certainly can’t replace the command shell, but it can at least coexist. One great application for PowerShell/Monad/MSH would be for build environments (especially for .Net targets).

  5. Miles Archer says:

    Raymond,

    64K was the maximum memory you could have. The minimum was 16K.

    For me it was a huge step up from the 4K I had on my Altair.

  6. Adam says:

    I don’t understand what the problem is.

    1) The new command shell should support monad scripts.

    2) cmd.exe should support batch files.

    a) If you’re in cmd.exe and run a batch file, it should do what it always does.

    b) If you’re in msh (? What is the name of the monad shell executable?) and run a monad script, it should do whatever that’s supposed to do.

    c) If you’re in msh and run a batch file, it should launch cmd.exe and run the batch file in that. When the batch file exits, that version of cmd.exe exits but your msh environment is untouched.

    d) If you’re in msh and run cmd.exe, you’re in cmd.exe, see a)

    e) If you’re in cmd.exe and run a monad script, it should launch msh and run the monad script in that. When the monad script exits, that copy of msh exits and your cmd.exe environment is untouched.

    f) If you’re in cmd.exe and run msh, you’re in msh, see b)

    If you have batch files, they’ll all still run fine from cmd.exe. You can keep developing them if you want. There are new rules for calling batch files from msh, but that’s OK because msh is new, so there’s no backwards compatibility there to keep.

    So, what’s the problem here? How hard is it to look at other systems that have multiple command shells with incompatible command sets and see how they handle it? (e.g. sh and csh on unix)

  7. andy says:

    Doesn’t seem like PowerShell supports this:

    PS C:\TMP> Gci env: -Include MARCO

    PS C:\TMP> .\marco.bat

    $ set MARCO=POLO

    $ set   | grep -i "marco"

    MARCO=POLO

    PS C:\TMP> Gci env: -Include MARCO

    PS C:\TMP>

    … the solutions above are nice enough, except that you’ll need to add them to all systems which you intend to run PS on.

    I guess we’ll end up with batch & PowerShell living together for some time, but eventually most of the stuff people use will either be upgraded/converted to PowerShell where necessary or left behind.

    "A new scripting language doesn’t solve everything"

    Yes, true. But the old scripting language doesn’t solve everything either. And as you’ve written, it is very, very hard to make any change to it because some software might depend on that small bug you wanted to fix. Therefore a new language is needed.

  8. Adam says:

    Almost forgot:

    g) If you’re in some other shell (e.g. windows explorer) and run a batch file, it loads cmd.exe and runs the batch file as it always used to, as in c)

    h) If you’re in some other shell (e.g. windows explorer) and run a monad script, it loads msh and runs the monad script however it’s supposed to, as in e)

  9. James Risto says:

    I find that CMD.EXE gets better and better with each OS release. Sheesh, it has if-else to avoid some goto hell, tab file name completion, wicked FOR function, and FINDSTR as a replacement for FIND. When that runs out, I use VBScript or even a C program to enhance batch scripts.

  10. oldnewthing says:

    Adam: In other words, you’re willing to accept that batch files from msh don’t quite work right. That’s a fine decision, though not everybody will agree with it.

  11. Adam says:

    No, they don’t "quite work right" in that environment variables aren’t passed all the way back up the process tree, but so what?

    If you want to run a batch file that does that, you can run it from cmd.exe. You don’t need to ever take that away from anyone.

    (Although, I’m now wondering, does cmd.exe pass its environment "up the process tree"? Do other 32-bit windows programs? How far up the tree do env vars get propagated? Until you reach a change of userid? I take it env vars don’t get passed back up to login.exe, or whatever the login program is that runs as ?localsystem?)

  12. ahrh says:

    "Many organizations have a system of batch files that rely on the ability to pass parameters between scripts by stashing them into environment variables."

    This is a ridiculous way of doing anything important.  Any organization doing that can easily rewrite their crappy scripts.  On the other hand, if it is mission critical software that they refuse to rewrite correctly, then they can stick with the operating system it runs on too.

  13. BryanK says:

    It seems to me that the problem was the decision way back in DOS to make batch files able to modify their parent’s environment.

    (And yes, DOS was single tasking, so the concept of a "parent" process didn’t exactly exist.  That was part of the problem too — Unix, for instance, ran multiple processes just fine.  Of course it ran them on CPUs that were a LOT better than the 8088/8086; Intel CPUs had no way to separate processes until the 386.  So the blame is not entirely Microsoft’s; a good chunk of it is Intel’s.  Not that that helps in fixing the problem, either.)

    If the batch processor had not run the batch file in the context of itself, but had shelled out a new instance of itself instead (like sh, csh, ksh, bash, etc. all do), then no one would have gotten used to being able to modify the parent’s environment.

    > Adam: In other words, you’re willing to accept that batch files from msh don’t quite work right.

    You’re right, but they don’t work right from Explorer today.  When Explorer launches a batch file, it runs in a new instance of cmd.exe, which then exits.  The environment variables do not take effect in the instance of Explorer that launched the batch file.

    It seems to me that running batch files from msh isn’t that different from running them from Explorer.  Users don’t double-click on a batch file expecting its changes to take effect when they double-click on the next one, because that won’t work.  Whatever workaround they use for that would also work to run the batch files from msh.

  14. BillyBob says:

    >This is a ridiculous way of doing anything important.  Any organization doing that can easily rewrite their crappy scripts.  On the other hand, if it is mission critical software that they refuse to rewrite correctly, then they can stick with the operating system it runs on too  <<

    My, aren’t you being a bit self-righteous?  

    Our build system currently uses batch files.  We have one batch file to set the correct paths to our output directories.   Why?  Because standardizing crap like that was considered too stupid when we could abstract it into one batch file that set the directories to the common files.  Each of our 50+ build scripts calls this one batch file to set the correct environment variables.  

  15. WhatIsTheProblem? says:

    Why would this break anything?

    "It seems to me that running batch files from msh isn’t that different from running them from Explorer.  Users don’t double-click on a batch file expecting its changes to take effect when they double-click on the next one, because that won’t work.  Whatever workaround they use for that would also work to run the batch files from msh."

    Exactly

  16. davidacoder says:

    "Shipping a new command shell doesn’t solve everything either." Who said it would solve everything? You attack an enemy that isn’t there…

    PowerShell is tackling a whole new set of problems than batch files. If you look at it as just a replacement for batch (as you seem to do), yes, a cynical comment like "Good luck…" might be appropriate. But I believe you completely missed what the planned area of use for this is. Every attempted to manage Exchange with batch files?

  17. Environment variable changes in a batch file do not affect any other already-running processes.  Batch files run within the context of the existing cmd.exe process, and affect only the environment variables of that process.  You can prevent even that with setlocal/endlocal.  Environment changes made by batch files can be inherited by child processes, but they never affect parent processes.  (That’s actually a good thing!)

  18. Anonymous Coward says:

    "It seems to me that running batch files from msh isn’t that different from running them from Explorer.  Users don’t double-click on a batch file expecting its changes to take effect when they double-click on the next one, because that won’t work."

    Well, I can see the following scenario. In fact, I bet it happens frequently after the release of PowerShell.

    A project has several CMD scripts. They all run another script that sets the env variables correctly for that project. That script might even be one of the VS scripts to setup %INCLUDE% and %PATH% for the appropriate version of VS for that project.

    Somebody sets out to add a new script. "Hey, I’ll check out Monad^H^H^H^H^HPowerShell!" First line: run CMD script to import env variables. Duoh.

  19. JCAB says:

    Behold the power of the batch file!

    http://www.lysator.liu.se/tolkien-games/entry/hobbit-true.html

    This would be an example of a program that wouldn’t run from Monad. But… as several people mentioned, as long as CMD.EXE is still there and can be used as an interactive shell, that is just not a problem.

    Not everybody will agree with whichever decision is taken, no matter what it is. But you have to suck it up and make your decision anyway. Leaving CMD.EXE completely separate, at least, is more future-proof. Just make sure to include it in the proper documentation (KB article, whatever).

  20. AC says:

    And what’s this "PowerShell"? I tried googling, but haven’t found something that can be a new batch language?

  21. oldnewthing says:

    Batch files are typically not run from Explorer. They are run from a command prompt.

    If everybody is in agreement that a new scripting language doesn’t have to solve everything, then why do people keep touting each new command shell as the greatest thing ever that will kill batch files dead?

  22. Dflare says:

    I just don’t see the problem. It should use the same concept as Linux systems do. The shell is a component, so there should be lots of pluggable shells, so you can use the one that suits you.

    The old .bat files should run against the old cmd. If you want something more fancy, you can use sh, bash, etc. (or msh in this case :) ); this way, it is very easy to write the scripts that you want in the language that you want (the .NET philosophy) (I would love a sC#).

  23. oldnewthing says:

    Dflare: The problem is interoperating with other script languages. That’s my point. Interoperating with batch files is hard.

  24. Dflare says:

    Not necessarily. I mean, if every shell supports the basic architectural structures, you can support it. Taking your example, if the environment variables are shared between shells you can run your .bat file inside another shell, take those variables and modify them, and then run another shell, etc.

    Anyway, thankfully the problem is with one shell (the old MS-DOS one), so you can give functions or support structures to other shells so they can interact with this one shell, and then make a foundation so every new shell could be (in a way) interoperable

    :D ( Anyway, I love the old msdos, but What i usually do is install the Unix tools so I can do things like ls|grep ‘hi’ on the cmd prompt)

  25. Maurits says:

    I wish setlocal was the DEFAULT behavior in batch files.  It’s rare that I want to modify the caller’s environment from a batch file, and frequent that I want to store locally-important state.
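    The setlocal semantics Maurits wants — scoped environment changes that roll back automatically — can be illustrated with a context manager (Python here, purely as an analogy for batch setlocal/endlocal):

```python
import os
from contextlib import contextmanager

@contextmanager
def setlocal():
    """Rough analogue of batch setlocal/endlocal: environment
    changes made inside the block are rolled back on exit."""
    saved = dict(os.environ)
    try:
        yield
    finally:
        os.environ.clear()
        os.environ.update(saved)

os.environ["MARCO"] = "polo"
with setlocal():
    os.environ["MARCO"] = "mallo"
    print(os.environ["MARCO"])  # prints: mallo
print(os.environ["MARCO"])      # prints: polo
```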

  26. davidacoder says:

    "why do people keep touting each new command shell as the greatest thing ever that will kill batch files dead?"

    Who claims that?!? People are excited about PowerShell, because it is incredibly powerful and will allow us to do things that weren’t possible with batch files. But being excited about something new does NOT imply that we believe it will kill off the old thing. I really feel you are putting words into the mouths of the people who are excited about PowerShell with sentences like that. Had the PowerShell people claimed that once they shipped, batch files would be dead, a comment like "good luck" (implying you believe it will never work) would have been appropriate. But I just can’t see that the people who think PowerShell is the greatest thing in "shell space" also think it will kill off batch files overnight, or even over years.

  27. mikeb says:

    Regarding changes in a cmd script’s environment not propagating to Monad (or psh):

    Since Microsoft controls both cmd.exe and psh.exe, it would be possible to distribute an updated cmd.exe with the PowerShell distribution.  This cmd.exe would be essentially the same as the current cmd.exe, but will also have a protocol that psh can use to grab its changed environment.

    I know that updating cmd.exe from an optional utility package seems like it would open a lot of potential problems, but it seems to me to be quite feasible.

    An alternative is that PowerShell can distribute its own version of cmd.exe (say, pshcmd.exe) that’s built from the same source as cmd.exe, with the additional environment change protocol built in.  Since cmd.exe’s source is used to build the thing, it’ll have 100% compatibility with cmd.exe.

  28. This is exactly why I started writing my own shell. It’s called the undeadshell and it supports a (currently) limited bash and cmd syntax. You can switch between the two by setting the variable UNDEAD_SYNTAX to bat or shell.  And one day it will let you run batch scripts from the bash side and bash scripts from the batch side.

  29. oldnewthing says:

    davidacoder: People are saying it right here in the comments!  "Any organization doing that can easily rewrite their crappy scripts."

  30. People can jump on Raymond all they want for this, but he’s absolutely right.  You’re missing the point of his post if you think that using cmd and Powershell for separate tasks is a reasonable solution.  It may be a reasonable solution for the user, but it doesn’t incentivize the user’s adoption of Powershell.

    We use batch files for our build environment setup, and we use python invoked from batch files for quite a bit.  If I know and use this large library of batch files already, what’s the incentive to jump to Powershell for some work and back into cmd for other work?

    I don’t know many people who use csh, bash, ash, tcsh, zsh, and sh for each of the little quirks.  Most everyone I know just uses bash.

    I’m not going to bother uninstalling Powershell, but it’s highly unlikely that I’ll be using it in the near future (read: next several years).  I imagine that there must be some people who think that extending cmd to pass back environment changes is worth it.

  31. Why won’t Microsoft write a new OS from the bottom up? I finished an almost-rant about that <a href="http://stuckinthecube.blogspot.com/2006/04/attention-microsoftie-infidels.html">about 15 minutes ago</a>. I actually like batch scripting. Maybe it has something to do with the fact that I can remember programming in COBOL and FORTRAN. On punch cards. Honest.

    The command shell is important. Basic batch commands and manipulations are important. If I need some customer to E-Mail the contents of a directory, DIR X: /S >> DIR.TXT takes eight seconds, but trying to do that from the GUI? Fuhgeddabowdit.

    To this day there are companies who write sh scripts and limit them to whatever can also be done in DOS batch.

  32. vince says:

    "Those who don’t understand UNIX are doomed to reinvent it, poorly."

    –Henry Spencer

  33. oldnewthing says:

    Apparently OS/2 and Windows NT were chopped liver.

  34. Gene Hamilton says:

    @ReallyEvilCanine

    How would re-writing an OS help?  All you do is wind up with a new crop of bugs and new problems.

  35. AC says:

    Well, I must admit that I don’t know how I can use the Win32 API to change environment variables for the parent, or at least for the user or computer. Does anybody know?

    For example, I’d like to make something which would permanently change PATH more comfortably (one path per line) than through "My computer" properties, where I have a lot of paths all in one single line, and it’s a *real pain* to edit.

  36. Ray Trent says:

    So… You’re really going to sit there and tell us that Microsoft can’t figure out how to solve the problem of propagating environment variable changes to the parent process?

    Modesty is one thing… but that’s just ridiculous.

    But for all those pooh-poohing the problem, tell me: as stupid and ridiculous as global variables are, can you really tell us with a straight face that you’ve *never* written a program that used them? ‘Cause that’s exactly what these cross-batch-file environment variables are.

  37. Cooney says:

    This is really easy. So easy, in fact, that unix did it 30+ years ago:

    #!python

    this is a python script

    And for all of you who think that cmd is actually doing something innovative by giving you if statements and tab completion, perhaps you could ask why it wasn’t here 10 years ago? sh and bash have supported this stuff for ages. We also have value defaulting, and full on functions.

    Yeah, it makes sense that this wasn’t around for Dos1.0 – it was a quick hack written for fun, but you should’ve had all this by at least Dos 5.0

  38. WhyNewShell? says:

    Who needs a new shell?  cmd.exe can execute perl scripts :-)

    ———————-

    @rem –*-Perl-*–

    @echo off

    perl.exe -x "%~dpnx0" %*

    goto :eof

    #!perl

  39. Adam says:

    Yeah, what BrianK said.

    If "not working quite right" means "working exactly the same way that they *currently* do from all shells other than cmd.exe (e.g. windows explorer)" then you have a point.

    (Forget my rambling about passing env vars up the process tree – your post suddenly made me think that running batch files from explorer could change the explorer environment.)

  40. oldnewthing says:

    Cooney: ? How does that solve the "environment variable sharing" problem?

    If you just wanted to solve the "run a particular script with a particular interpreter", that’s already been done – create a file association saying that *.python files should be run by python.exe.

  41. Cooney says:

    There is a simple solution that you are ignoring (willfully, it appears). Make cmd the default, just like today, and allow people to change their interpreter. New interpreters need not be fully compatible with the foibles of cmd, as you have the option of using it or not (default to using cmd). You have to use the same command interpreter as the script in order to share env variables (just like unix). You should also make this require an artifact, as cmd is the only one that does it.

    If you run a script, it defaults to being a cmd script unless specified otherwise.

    What I’m saying is that the new cmd interpreter need not be compatible with the old one, since changing the interpreter is a deliberate act.

  42. oldnewthing says:

    That still doesn’t solve the problem where you’ve already started the alternate shell (PowerShell, bash, whatever) and then you want to run a cmd.exe batch file that relies on environment variable sharing or current directory sharing or pushd stack sharing or some other cmd.exe feature where batch files can communicate with each other after they have exited. That is the problem I’m discussing today. The #!python thing is nice but it is also irrelevant to the topic at hand.

  43. HA HA HA says:

    they’re is *no* porblam u cant slove wiht a new sriptign langage!

    especily if its dasignd acording to the inventers crackpot theries abuot alnguage design or is a subtly incompatable dilect of lisp. or better yet alla teh above.

    i meean liek *duh*?!

    sheesh.

  44. Guest says:

    Well, Microsoft could write a converter that converts cmd script files into PowerShell ones…

  45. oldnewthing says:

    Good luck converting "goto".

  46. Ivo says:

    Am I missing something obvious here?

    Why have separate exes for old and new style scripts? If you start a .bat file from the command prompt cmd.exe should execute it using the old syntax. If you start a .bat2 file it should use the new syntax. Both are interpreted by the same exe and will share the same environment.

  47. Matthew says:

    > This is a ridiculous way of doing anything important.

    Why? If it works, what is wrong with it? Seriously.

    >> That is the problem I’m discussing today.

    Raymond, you should know better than to expect people to READ, UNDERSTAND and ENGAGE with your posts. It’s much easier for them to imagine some other topic you are discussing and respond to that instead.

  48. Cooney says:

    Raymond,

    If you’re in <random shell> and you invoke magic.cmd that mucks about with environment variables, it gets a copy of the environment.

    The compatibility break is this: you don’t then copy the env back out on exit. If that’s necessary, then wrap the cmd scripts in a script and they can share the same interpreter. This env sharing is a nasty thing and deserves to die – you should only be able to do that sort of thing inside of the same cmd interpreter, such as a bash script sourcing another bash script.

    Ivo:

    > Why have separate exes for old and new style scripts?

    Because there’s no reason to tie this together. Understand that most people will leave cmd.exe alone, but the ones of us who like running bash or whatever will like being able to upgrade that stuff independently.

  49. Cooney says:

    A frankly minor interop issue between old and new scripts doesn’t lessen the value of a decent shell at last.

    IMNSHO, interop in the face of a workable way to distinguish shells (like #! syntax) is a nonissue.

    I use Unix. I’m so spoiled.

  50. BryanK says:

    Anonymous Coward:  Yes, that would be a problem.  That’s why sh/ksh/bash have the "." or "source" builtin — to run another script in the context of the current shell process, so that environment variable changes (or shell function definitions, or whatever) made in that script also take effect in the parent.  But it doesn’t work when the target script is written in another scripting language; in that case, there is no solution.

    (As Aaron said, environment variable settings are usually propagated down the process tree, but *NEVER* up it.  It just looks that way in batch files because all .bat / .cmd files called by one "parent" batch file or from the command line get interpreted by the same process.)

    Anyway, I’m no longer sure what my point is, but I think it has something to do with the fact that different languages rarely interoperate without the programmer thinking.

    In the case of monad / powershell / whatever it’s called this week, if it has a "source" equivalent, then it would be possible to rewrite the target batch file in monad (of course this creates two versions of the same script that have to be kept in sync), and then source that script.
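    The distinction BryanK describes — sourcing versus spawning — comes down to whether the script text runs inside the current process. A minimal Python analogue of a "source" equivalent (the script text and variable name are hypothetical):

```python
import os

# Hypothetical "script" text; in sh this would be a file you run
# with `. script.sh` instead of `sh script.sh`.
script = "import os; os.environ['BUILD_FLAVOR'] = 'chk'"

# Executing the text inside the current process is the moral
# equivalent of the shell's `.`/`source` builtin: the environment
# change outlives the "script".
exec(script)
print(os.environ["BUILD_FLAVOR"])  # prints: chk
```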

  51. Byron Ellacott says:

    Hmm.  I’ve not yet seen someone propose that the batch file interpreter of CMD.EXE be wrapped up as a library and used directly by replacement shells.  Tada, it’s no longer a child process, and is free to screw around with the replacement shell’s environment as much as it likes.

    If worst comes to worst, it’s not even a particularly complex syntax, it can be reimplemented from scratch in a fairly short time.

  52. bramster says:

    Raymond said

    "Batch files are typically not run from Explorer. They are run from a command prompt."

    Alas, were it true.

    I’m a command line guy. . .    I write a lot of programs in which a batch file passes the output of program 1 to program 2 to program 3, etc.

    The Windeeze use explorer, and click on my batch file. . .

    Now, a lot of this discussion is waaaay over my head. . .     but I’ve seen batch files setting environment variables. . .  and I ask why.   With 300 Gb drives at $150 or so, why not just run your whole job in the current directory, and forget about all that pain?

    Am I missing something?

  53. oldnewthing says:

    Cooney: I’m not sure what you meant by that “interop in the face of a
    workable way to distinguish shells is a nonissue”. Are you saying,
    “Since there is a workable way to distinguish shells [file extensions
    in the Windows case], interop is a nonissue”? In other words, the
    “environment sharing” problem is a figment of my imagination? When you
    wrote, “this env sharing is a nasty thing and deserves to die”, do you
    mean “Anybody who does this deserves to be broken”? Telling people that
    they “deserve to die” is not a great way to win them over.

    Byron Ellacott: “[the cmd.exe interpreter] can be reimplemented from
    scratch in a fairly short time.” And then when Vista adds new features
    to the cmd.exe interpreter, your reimplementation stops working (since
    it doesn’t support the new features that new Vista batch files rely on).

  54. Andy C says:

    "If everybody is in agreement that a new scripting language doesn’t have to solve everything, then why do people keep touting each new command shell as the greatest thing ever that will kill batch files dead?"

    You’re absolutely right Raymond. It’s like that Windows thing, everyone says that will kill DOS stone dead but the fact it works slightly differently I’m sure means we’ll all still be using it for years to come…

    *sigh*

    Will all those batch files go away overnight? No. Are sysadmins like myself crying out for something less of a mess than batch files but less hard work than vbs? Absolutely. Batch files are horrid to write and even worse to maintain, not as bad as Perl but close.

    A frankly minor interop issue between old and new scripts doesn’t lessen the value of a decent shell at last.

  55. junfeng says:

    Just leave the batch file alone. People will write script for the new shell.

    Unix has so many shells. I don’t see people complain about interoperability.

  56. cheong00 says:

    Sorry for being OT, but I’d like to ask: Is there any conclusion reached for how to handle the Samba bug in fast mode? I really want to know what handling will be made regarding that issue as I’m running 2 or 3 Samba file servers.

  57. Cooney says:

    Raymond: I’m not sure what you meant by that "interop in the face of a workable way to distinguish shells is a nonissue". Are you saying, "Since there is a workable way to distinguish shells [file extensions in the Windows case], interop is a nonissue"? In other words, the "environment sharing" problem is a figment of my imagination? When you wrote, "this env sharing is a nasty thing and deserves to die", do you mean "Anybody who does this deserves to be broken"? Telling people that they "deserve to die" is not a great way to win them over.

    I’m saying a couple of things:

    1: file extensions are a bad idea. Everything should be a .cmd. Extend cmd.exe to recognize alternate interpreters and launch them, then leave it be.

    2: nobody needs interop with cmd, since it’s still there.

    3: environment pollution is a bad idea and isn’t worthy of being supported outside of cmd. It’s reminiscent of the poor isolation inherent in windows that leads to many of its current problems.

    4: environment sharing is indeed a misconception of yours. It does not exist as a requirement for new shells: all they must do is provide a compelling experience, not implement a superset of cmd.

    5: anybody who doesn’t agree with me and likes cmd just fine will not be affected in the least, as I intend no changes to cmd. It’s a relic and offers nothing I haven’t had elsewhere for a decade.

    I’d like to add a couple more things:

    I like unix, mainly because I can do everything from the cmd line. This is important when screwing with one or more servers sitting on the other side of an ocean.

    Windows would do well to emulate the parts of this that are useful – remote admin without a mouse or gui is a powerful thing.

    > Telling people that they "deserve to die" is not a great way to win them over.

    Final dig: I tell people that the feature is useless, not that they should die. In fact, I am even more polite than that: I say that the benefits do not justify the concomitant problems.

  58. orcmid says:

    Wow, I’m almost sorry that I read all of these comments.  

    My sense is that batch files (and command-line utilities that accept response files) will be around for a long time.  For one thing, VS 2005 still produces them and uses them to accomplish builds.  A lot of Platform SDK material uses them, as do other language-tool examples.  

    I also find batch files very handy when sharing code and builds.  You can count on them being usable, and you can easily set them up so that customization to a particular configuration is straightforward.  

    I expect nmake to be around for the same reason, even though VS 2005 has a "better way."

    In addition, before we get to PowerShell, there’s already WSH and the assured-to-be-available scripting languages JScript and VBScript.  (I just saw where there is an Ajax-oriented kit that will pack up a JavaScript into an .exe for easy deployment.  I can dig that too! http://ajaxian.com/archives/javeline-deskrun-run-ajax-apps-as-native-windows-programs)

    So I don’t think I will worry about PowerShell very soon.  

    Also, my approach to forcing toolcraft on others is to always share code using the lowest-level scripts that I know will work on the platforms I am sharing code for, and that’s usually *.bat and cmd.exe.  I don’t mind people using alternative shells (I fancy 4DOS and 4NT myself), including bash, other *nix shell flavors, REXX, and PowerShell. I just don’t want to have that mean I need to run them too.  (The number of different scripting and build tools it can take to build a single *nix application is ridiculous.)

    I figure the next step up from *.bat is *.js or JavaScript hosted in a local web page (or a *.hta if I’m really feeling lucky).

    At bottom, I think it is basically a big 80-20 thing.  Simple tools for simple purposes win with me.

  59. Jonathan says:

    AC: to set vars permanently, use setx.exe (available in Win2003 and above, I guess). It’s the same as setting it in System Properties.

    When I do need to edit it manually, I always copy the path from the (ridiculously small) edit box into notepad, edit it, and then copy it back.

  60. AC says:

    Thanks Jonathan, I’ve found SetX for W2K. I’ve also found the documentation for changing the environment variables at the system level:

    "To programmatically add or modify system environment variables, add them to the HKEY_LOCAL_MACHINE\System\CurrentControlSet\Control\Session Manager\Environment registry key, then broadcast a WM_SETTINGCHANGE message."

    which is what SetX probably uses. But I still don’t know how USER defined variables can be accessed.

    Regarding PowerShell: now that the search engines are updated, that’s the new name (since Tuesday) for Monad/Msh. I didn’t google on news.google.com last time. :) Btw — don’t you hate the term "experience" used as an obligatory word in MSFT materials? Marketspeak^2.

  61. oldnewthing says:

    Centaur: What if multiple scripts "bar1", "bar2", "bar3" all want to use foo1.cmd to set common variables such as the PATH?

    Cooney: Okay, so you’re agreeing that "A new scripting language doesn’t solve everything" and are taking the option in the second paragraph: "Maybe you decide that you won’t and prefer to force people to rewrite all their batch files into your new language. Good luck on that."

    I think a lot of the misunderstanding is that people are going for the second-paragraph option without saying so.

  62. Centaur says:

    The whole idea to call a script that will set environment variables is a design antipattern.

    *Context:* A script FOO needs an unspecified set of environment variables.

    *Proposed solution:* The script FOO calls another script BAR which sets the variables and returns.

    *Forces:* environment changes propagate down to child processes and in the same process but not up to parent processes.

    *Resulting context:* Both scripts have to run in a single process, since otherwise environment changes made by BAR will not propagate to FOO.

    *Therefore:*

    Invert the parent/child relationship between FOO and BAR. Have BAR call FOO as its last command. If the environment set up by BAR is going to be used by more than one script FOO1, FOO2, …, FOOn, have the continuation command passed to BAR in command line parameters.

    ::: foo1.cmd
    if "%SOME_VAR%" == "" goto noenv
    :: do the work
    goto end
    :noenv
    bar.cmd %0 %*
    :end
    ::# foo1.cmd

    ::: bar.cmd
    set SOME_VAR=hello world
    set SOME_OTHER_VAR=polo
    %*
    ::# bar.cmd

  63. Stefan Kanthak says:

    AC:

    [HKEY_CURRENT_USER\Environment]

    "Name"="Value" ; REG_EXPAND_SZ works too

    then broadcast a WM_SETTINGCHANGE message.

    But: running instances of CMD.EXE don’t pick up the change, for either SYSTEM or USER variables.

  64. microbe says:

    > Although, I’m now wondering, does cmd.exe pass its environment "up the process tree"?

    I don’t think anyone is passing env back to parents. In the example Raymond gave, he runs multiple .bat files inside one cmd.exe. So it seems that in this case a .bat file doesn’t spawn a new process; instead it’s interpreted inside the current cmd.exe directly, so they share the environment.

    I don’t know if it’s the case for MSH too… but if you run a .bat inside MSH and MSH has to spawn a cmd.exe, apparently you can’t share env among multiple .bat files (unless you pass multiple .bat files to the same cmd.exe in one command line).
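    [The one-way flow microbe describes isn’t specific to cmd.exe: on any OS, a child process gets a *copy* of its parent’s environment block, and changes it makes die with it. A minimal Python sketch of that rule — the variable name MARCO is borrowed from the article’s example, nothing here is cmd-specific:]

```python
import os
import subprocess
import sys

# Parent sets a variable; any child it spawns inherits a *copy*
# of the parent's environment at creation time.
os.environ["MARCO"] = "polo"

# The child changes the variable and exits. The change affects only
# the child's own copy; nothing is written back to this process.
child_code = "import os; os.environ['MARCO'] = 'changed'; print(os.environ['MARCO'])"
result = subprocess.run([sys.executable, "-c", child_code],
                        capture_output=True, text=True)

print(result.stdout.strip())   # what the child saw after its own change
print(os.environ["MARCO"])     # parent's value, untouched by the child
```

    [Which is exactly why a new shell that defers to cmd.exe loses the batch file’s environment changes: they happen in cmd.exe’s copy.]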

  65. oldnewthing says:

    "You rewrite your scripts if you want to use bash. Don’t if you want to keep cmd."

    And then your cmd batch scripts that try to share environment variables stop working when run from your alternate shell. That’s my point.

    "You’re acting as if there can be only one shell."

    Quite the contrary. There’s cmd.exe and there’s "alternate shell" and the interop boundary is where the problem is. If you decide that you want your alternate shell to be able to run batch files, then you have the environment variable problem.

  66. BryanK says:

    Cooney:

    > The compatibility break is this: you don’t

    > then copy the env back out on exit.

    But cmd.exe *does not do this*.  If you have a batch file that calls another batch file, the second file does not run in a new process.  You know Unix, where each shell script runs in its own process unless the calling script says otherwise; cmd.exe simply doesn’t work that way.  It runs every called batch file in the same process.

    > you should only be able to do that sort of

    > thing inside of the same cmd interpreter,

    > such as a bash script sourcing another bash

    > script.

    You *can* only do that sort of thing inside the same cmd interpreter; it’s just that "inside the same cmd interpreter" has always been the default.

    Yes, this is broken (I fully agree with your point #3 above), but unfortunately it’s the way it is, and people have made (IMO stupid) script systems based on this misfeature.  And Microsoft doesn’t think it can say "well that’s just not going to work anymore".  (For that matter, they’re probably right; they probably can’t.)

  67. Dewi Morgan says:

    Adam’s a-h list is correct, and the only way that anyone would want or expect it to ever work.

    This is, after all, the way that cmd.exe does it: if you call cmd.exe from within cmd.exe, then the "inner" shell will not pass its environment to the calling shell. It was ever thus, by design.

    The problem appears to be that .bat files will not affect the environment in the new shell, which will upset interactive users of the new shell, attempting to use their old library of interactive batch files.

    Carlos and others suggested a workaround which was basically associating batch files with a cmd.exe wrapper that stored the environment on a per-shell-PID basis, rather than with cmd.exe itself. Downside there is that when you run it from windows explorer and all other shells in the universe, you don’t expect or want the previous environment to be stored, and storing it may break scripts designed for use by clicking in explorer or running from any non-cmd.exe shell. Which means you need a separate association for .bat files in monad compared to the association in every other shell you might use, so that monad and cmd.exe can run .bat files and look the same.

    This is blatantly silly. Here in win2k, I have cmd.exe and command.com. They are incompatible, and batch files from one will not run in the other (though cmd.exe is MOSTLY backwardly compatible). We have command.com because cmd.exe is NOT always backwardly compatible. For people who use batch files with side effects, there will always be cmd.exe and command.com, just as there will always be notepad.exe in two places, and moricons.dll. They’re the historical compatibility tax.

    I personally will continue to use cmd.exe for commandline utils, because I prefer it.

  68. Cooney says:

    > And then your cmd batch scripts that try to share environment variables stop working when run from your alternate shell. That’s my point.

    And my point is that that’s too bad – if you want to share environments, use cmd. There’s no reason to restrict the use of shells because they don’t play well with cmd.

    > If you decide that you want your alternate shell to be able to run batch files, then you have the environment variable problem.

    If you run bash, you don’t expect batch files to mess with your env. There is no problem because interop is never promised.

  69. oldnewthing says:

    "interop is never promised."

    So you’re back in paragraph 2 after all. You don’t have 100% support for classic batch files. People who want to use their classic batch files may have to rewrite them in your new shell language.

  70. Cooney says:

    > So you’re back in paragraph 2 after all. You don’t have 100% support for classic batch files. People who want to use their classic batch files may have to rewrite them in your new shell language.

    Yes I do. If they want 100% compatibility, cmd.exe is still there. I’m proposing something new, which requires changes if you want to use it. The value proposition will hopefully be enough that some people will switch. If people with a large investment in cmd files don’t switch, so what? It’s a shell.

  71. Mark Steward says:

    Dewi: I agree with this, and my feeling is that a method for running batch files in "persistent" mode is the only effective solution.

    However, I’d say the problem with persisting environment variables is like cmd.exe not supporting pipes (and command.com continuing to).  It would fundamentally break a number of important uses for batch files, and forcing people to use command.com for pipes would make it difficult to take advantage of the new features in cmd.exe.

    This is basically about interfacing.  Yes, you can run all your interwoven batch files in one instance of cmd.exe, but you then need to get output from that back into whatever scripting language you’re using to call it.  And you need to plan that carefully – what if you want to change the order the batch files are called?  You need someone who can rewrite the legacy batch file interface.

    Of course, I’d prefer to rewrite the batch files, but there are only so many hours in a day – I still have some up-to-date code that’s built with batch files, because getting VS to build it is too much work.

  72. Carlos says:

    Dewi Morgan: you misrepresent me.  I’m suggesting that the new shell implement the wrapper for batch files.

    So cmd.exe is unchanged, and existing behaviour is unaltered.

  73. Fox Cutter says:

    Well, if I had to solve this I would do something like this.

    * Take advantage of the fact that the child process can get a copy of the parent’s environment.

    * When launching a batch file I would do more than just run cmd.exe with the batch file. I would run it with a custom batch file that has three steps.

    1: Run a small app that will write out a copy of the environment to a temp file (file name probably based on the process ID).

    2: Run the batch file

    3: Run a second app that will create a delta of its current environment (given to it by cmd and having all the environment variable changes) with the temp file, and write out all the changes to a final temp file.

    * In the new shell, once cmd returns it can read the file with the delta and update its environment as needed.

    Is this a simple and clean answer? Hardly. Will it work? I give it a fairly high chance of working correctly 95% of the time.

    The perfect-world solution would be to tweak cmd to report any environment changes to a different application (via a pipe or something), but that’s not really a realistic solution. The last thing you want is Power Shell installing a different (and potentially older) version of cmd.
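    [Fox Cutter’s snapshot-and-delta idea can be sketched in a few lines. This is a hypothetical illustration, not how any shipping shell does it: snapshot the environment before the batch file runs, diff against the environment the child ends with, then replay only the differences in the parent. The variable names (MARCO, the PATH values) are made up for the example:]

```python
def env_delta(before, after):
    """Return (changed, removed): variables the child added or
    modified, and variables the child deleted."""
    changed = {k: v for k, v in after.items() if before.get(k) != v}
    removed = [k for k in before if k not in after]
    return changed, removed

def apply_delta(env, changed, removed):
    """Replay a delta against the parent shell's environment dict."""
    env.update(changed)
    for k in removed:
        env.pop(k, None)

# Step 1: snapshot taken before the batch file runs.
before = {"PATH": r"C:\Windows", "PROMPT": "$P$G"}

# Step 2: the environment the child cmd.exe ended with.
after = {"PATH": r"C:\tools;C:\Windows", "MARCO": "polo"}

# Step 3: compute the delta and replay it in the new shell.
changed, removed = env_delta(before, after)
parent_env = dict(before)
apply_delta(parent_env, changed, removed)
print(parent_env)
```

    [Note this inherits the 95%-not-100% caveat: it can’t distinguish "the child deleted PROMPT" from "the child never touched PROMPT but the snapshot tool missed it", and two concurrent children produce conflicting deltas.]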

  74. Russ says:

    IMO, everyone is missing the real problem here. There should be an easy way for environment variable changes to be propagated to parent processes and/or globally. There currently is not, hence lots of problems…

  75. Mark Steward says:

    Russ: Yes, there should be a way to propagate variables to the parent, but how, and when do you stop?  Global variables don’t scale very well.

  76. Cooney says:

    Bryank

    > But cmd.exe *does not do this*.  If you have a batch file that calls another batch file, the second file does not run in a new process.

    If you want to maintain compatibility with CMD, you must do this when writing another shell.

    > Yes, this is broken (I fully agree with your point #3 above), but unfortunately it’s the way it is, and people have made (IMO stupid) script systems based on this misfeature.  And Microsoft doesn’t think it can say "well that’s just not going to work anymore".  (For that matter, they’re probably right; they probably can’t.)

    I’m not saying that. I’m saying that other shells can behave differently. CMD can do its own thing, and bash does its own thing.

    Raymond:

    > "Maybe you decide that you won’t and prefer to force people to rewrite all their batch files into your new language. Good luck on that."

    No, no, no. You rewrite your scripts if you want to use bash. Don’t if you want to keep cmd. You’re acting as if there can be only one shell.

    microbe:

    > I don’t think anyone is passing env back to parents.

    The only reason for this is to maintain the env sharing misfeature when interfacing two different shells. When you think about this, it’s such a bad idea that hopefully nobody will try it.

  77. Tony says:

    Raymond,

    Really good to see your support of another Microsoft Product.

    Your pompous attitude is really starting to get tiring.

  78. Adam says:

    Russ> "There should be an easy way for environment variable changes to be propagated to parent processes"

    And how is that supposed to work?

    Process A spawns child process B. At process creation time, B gets a copy of A’s environment. Process A continues to run. Process A changes its current working directory and PATH environment variable because it needs to. Process B then changes its cwd and PATH, does a few things and exits.

    At what point do you propose copying B’s changes back to A, and how do you expect to stop A breaking because of it?

    Processes should *NOT* be able to modify data in each other’s address space. Period. Environment vars are data in a process’s address space. If you disagree with this, please go read a book on OS design.

    If you want to add a function so that a process can READ a child’s environment (which might be sane, but I’m not completely sure about that – imagine an unprivileged process reading the environment of a more privileged child), then a particular parent might choose to read its child’s environment after it notices that the child has terminated, but before its resources are released by that parent (i.e. – while it is in a zombie state).

    I’m not convinced it’s a good idea though. Imagine a parent that runs two child processes. The environment the parent ends up with will depend on which child finishes last. Uurgh.

    No, auto-propagation of env data up the process tree is a bad idea.

    Russ> "and/or globally."

    WHAT!?!

    So, if two people are logged onto the same computer, either with fast user switching, or on a terminal server, you want changes in one person’s environment to be propagated to the other? Are you kidding me? You want to be able to change someone else’s PATH while they’re logged on and potentially in the middle of some operation?

    Um, NO! Talk about a security risk. Geez…

  79. ChrisR says:

    @Tony

    I have a perfect solution for you if you are getting tired of Raymond’s attitude:  stop reading his blog.  See it’s nice and easy, now run along.

  80. Neil says:

    Sure, the new shell won’t see environment changes made by your .bat file. Seeing as you’re already manually running a bunch of .bat files, what’s so bad with manually starting cmd.exe first?

    Alternatively, if the .bat files call each other automatically they’ll all be running under the same cmd.exe anyway.

    If you really need to pass environment changes back to the parent then some sort of eval works, e.g. eval `ssh-agent -s` or eval `tset -s`
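    [The eval trick Neil mentions works because the child never writes into the parent: it just *prints* assignments, and the parent chooses to evaluate them. A hypothetical Python version of the same handshake — the NAME=value line protocol and the variable names are made up for illustration:]

```python
import subprocess
import sys

def import_env(command, env):
    """Run a child that prints NAME=value lines, and fold those
    assignments into the given environment dict. The parent stays
    in control: nothing changes unless it asks."""
    out = subprocess.run(command, capture_output=True, text=True).stdout
    for line in out.splitlines():
        name, sep, value = line.partition("=")
        if sep:  # ignore lines that aren't assignments
            env[name] = value

# A stand-in child that "exports" two variables by printing them,
# the way ssh-agent -s prints shell commands for the parent to eval.
child = [sys.executable, "-c", "print('MARCO=polo'); print('BUILD=private')"]

env = {"PATH": "/usr/bin"}
import_env(child, env)
print(env)
```

    [This is also essentially the mechanism Raymond alludes to later: once parent and child agree to cooperate, any signaling channel for "here are the variables I want you to adopt" will do.]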

  81. Craig Matthews says:

    "Adam: In other words, you’re willing to accept that batch files from msh don’t quite work right. That’s a fine decision, though not everybody will agree with it."

    Unless Microsoft is going to force people to use MSH and send out shock troops to delete cmd.exe, I don’t understand what the problem is.  If I’m running a .CMD file, why would I even think I’m supposed to be able to execute that properly from MSH?  Why wouldn’t I just run it via cmd.exe?  

    It *is*, after all, appropriate for me to use the command shell that my batch script needs … isn’t it?

  82. oldnewthing says:

    Neil: Sure, you can do that, but it means that if you have a set of batch files, one of which prepares an environment ("setenv.cmd"), and others which use it ("co.cmd", "buildit.cmd") – then you have to stay in cmd.exe once you’ve run the first one. You can’t use your alternate shell any more.

    $ cmd.exe
    C:> setenv
    Environment set for development.
    C:> co myfile.c
    myfile.c checked out to you.
    C:> edit myfile.c

    C:> buildit
    myfile.exe built successfully.
    C:> exit
    $ (back to original shell)

    Is this acceptable? Is it acceptable to have to use some "eval" magic to re-import the modified environment variables?

    Craig Matthews: The problem is if you decide to want to use some alternate shell as your primary shell, and then you run some batch file (via cmd.exe) and it doesn’t work [see above]. Is this (1) a bug, or (2) a feature?

  83. steveg says:

    An idea: choose your preferred default behaviour and add a switch to cmd.exe to turn it on or off.

    Seems quite likely that any new shell language is going to be calling BAT files within the first few minutes of its release.

    I gather some readers would be amazed at the rather large amount of BAT code that exists. Heck, I’m amazed at the amount of Cobol code that’s still running.

  84. jachymko says:

    I see no problem using both the old and the new thing inside each other. I deleted my old environment-preparing batch file today. Its last command used to be PowerShell -NoLogo -NoExit, so the environment was inherited by PS, which was in turn used to launch other legacy batches.

    And BTW, I don’t plan to move everything to PS. cmd.exe is still better for simple static configuration scripts without too much logic (lists of setx, reg import, setup.exe)

  85. Dave W says:

    The solution here seems clear to me: each shell needs a command that will invoke a child interpreter giving it effective write-access to the parent’s environment.  That way the parent is in control of whether its environment gets modified or not (by default, not, except for within cmd).  This could be implemented using the wrapper techniques discussed above.  So you could have something like:

    my_shell % env_copy cmd setvars.bat
    # setvars gets to modify my_shell’s environment
    my_shell % env_copy cmd cmd1.bat
    # cmd1.bat needs to pass an environment
    # variable along to the next script.
    my_shell % cmd cmd2.bat
    # any changes made by cmd2.bat don’t propagate
    # back to here.

  86. oldnewthing says:

    I forgot to mention that co.bat changes environment variables (set FILES_CHECKED_OUT=1) that buildit.bat reads (if %FILES_CHECKED_OUT%==1 set BUILD_FLAGS=-DPRIVATE_BUILD). If you run co.bat from PowerShell, the modified FILES_CHECKED_OUT variable won’t be passed to buildit.bat, causing your build to be marked as "built from current sources" rather than "build contains private changes".

  87. Skywing says:

    Actually, razzle is Microsoft internal and not available in the DDK last time I checked.  Such tools are not meant for us mere mortals…

  88. Adam says:

    Raymond> "then you have to stay in cmd.exe once you’ve run the first one. You can’t use your alternate shell any more."

    How does that follow?

    If I am in cmd.exe and run setenv.bat, then it should set up the environment properly.

    As already mentioned, from here I should be able to run PowerShell shell scripts, and they should pick up the environment vars from cmd.exe, including the ones set by setenv.bat. Are we at least in agreement here?

    More importantly, if I run an interactive PowerShell shell from that cmd.exe, then I’ll be in the new shell, with all the command-line goodness that the new shell has, and I’ll have inherited all the environment variables from the parent cmd.exe process, including the ones set by setenv.bat.

    I could also run co.bat and buildit.bat – old batch files that have not been converted yet – from that PowerShell, as the cmd.exe subprocess will inherit its environment from that shell, which will include the setenv.bat vars that it inherited from *its* cmd.exe parent.

    If I’ve missed something, please explain in small words. I still don’t see where the problem is that you’re describing.

  89. Just so I understand this, the problem is that having cmd.exe be the handler for .bat files will break because Monad (a new Windows shell?) will run cmd.exe as its own process, and that means it’ll get its own copy of the environment which then gets thrown away (with any modifications being lost) when the process ends? And this didn’t happen before Monad because running a batch file from an existing cmd.exe would *not* start a new cmd.exe but instead run it in the current one?

    If that’s the case, is it possible to give batch files special handling? Have monad.exe execute batch files itself, but the code in monad simply execs a new cmd.exe to run foo.bat, grabs its environment before it exits, and applies that environment to the running monad.exe (as Carlos suggested up the top in http://blogs.msdn.com/oldnewthing/archive/2006/04/27/585047.aspx#585084).

    Any other shell that wants to implement this special "in-process running" for batch files can then do the same thing. I don’t know, however, if it’s possible for process M (Monad) to exec a process C (cmd.exe) and then peek at its environment.

  90. BryanK says:

    Stuart:  See Adam’s post here:

    http://blogs.msdn.com/oldnewthing/archive/2006/04/27/585047.aspx#586484

    for problems with that approach.  ("Imagine a parent that runs two child processes. The environment the parent ends up with will depend on which child finishes last.")

    Yes, it might be possible to figure out the environment *changes*, and somehow apply both sets of changes from both sub-processes, but what do you do when you see a conflict?  (E.g., all sub-processes change the same environment variable.  Which new value do you use, the one from the process that exited last?  Yikes.)

  91. Jules says:

    — Quote —
    Russ> “There should be an easy way for environment variable changes to be propagated to parent processes”

    And how is that supposed to work?

    Process A spawns child process B. At process creation time, B gets a copy of A’s environment. Process A continues to run. Process A changes its current working directory and PATH environment variable because it needs to. Process B then changes its cwd and PATH, does a few things and exits.

    At what point do you propose copying B’s changes back to A, and how do you expect to stop A breaking because of it?

    Processes should *NOT* be able to modify data in each other’s address space. Period. Environment vars are data in a process’s address space. If you disagree with this, please go read a book on OS design.
    — End quote —

    Obviously it doesn’t work in this case.  Clearly, the copying must only happen in a small subset of cases where it would work.  To this end, it should only happen where process A has specifically requested the behaviour.

    — Quote —
    Imagine a parent that runs two child processes. The envrionment the parent ends up with will depend on which child finishes last. Uurgh.
    — End quote —

    Clearly this is also a situation where the facility should not be used.  It only really makes sense to use it if process A stops what it is doing after executing process B and waits for process B to complete, as would be the case in the current example of a shell executing a different shell to interpret a script file in a language it didn’t understand.

    [Since you already presume that the two processes are already in cahoots, you can come up with whatever mechanism you like for the child process to signal the parent “Hey, here are some variables I want you to copy into your environment.” -Raymond]
  92. Bryan,

    Ah, I see the problem there. Would it not be easier to just not run .bat files in the background, but instead hang until they’re finished? Then you can’t run two at once. A switch added to monad’s "call" command that runs a .bat file in the background and returns immediately can then be used by people who don’t care about the environment issue.

  93. Adam says:

    Raymond:

    OK. In that particular case, yes, you can’t use PowerShell.

    You’ll just have to use cmd.exe. Oh no! How ever will you cope?

  94. Carlos says:

    @Adam and BryanK:

    "Imagine a parent that runs two child processes. The envrionment the parent ends up with will depend on which child finishes last."

    This is a non-problem.  At present, if you want to run two batch files in parallel you need to use two instances of cmd.exe and the environments aren’t shared.

    Since we’re trying to support existing behaviour there’s no reason for a new shell to act differently.  So when you launch a child batch file asynchronously you don’t use a wrapper and you don’t attempt to capture the resulting environment.  And everything works just as you’d expect.

  95. BryanK says:

    Hmm… true, I suppose.

    So basically you’d be importing the "source" builtin from Unix sh, except that the target would not have to be written in the same language as the source, because you have a way to communicate changes.

    How far would this extend?  Hopefully not to arbitrary EXEs that are supposed to run in a more-privileged child process (e.g. by calling LogonUser with a static password, and then whatever API starts the token impersonation)…

  96. Lee Holmes says:

    Hmmm, sorry I missed this conversation — my RSS reader was busted for a while and I didn’t realize it.

    I sense two messages here:

     – <Technology X> is not a silver bullet

     – Interop and backwards compatibility is hard.  New technologies ultimately break this at some point.

    I put some thoughts about the topic here: http://www.leeholmes.com/blog/NothingSolvesEverythingPowerShellAndOtherTechnologies.aspx

Comments are closed.