What seems obvious today may have been impractical then


In the discussion of the environment variable problem, BryanK posits that the real mistake was allowing batch files to modify their parent environment in the first place. Instead, they should have run in a sub-process.

Try saying that when your computer has only 16KB of memory, which is how much memory the original IBM PC came with.

Heck, try saying that when your operating system doesn't even support sub-processes! It wasn't until MS-DOS 2.0 that the ability to run a process and then regain control after the process exits even existed. MS-DOS 1.0 followed the CP/M model wherein exiting a process freed all the memory in the computer (save for the operating system itself, of course; thank you, nitpickers) and loaded a fresh copy of the command interpreter. There were some checksum hacks to avoid reloading the command interpreter if it didn't appear to have been modified by the program that just exited.

Besides, if batch files couldn't modify the environment of the command interpreter, the AUTOEXEC.BAT file would be pretty useless.
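
For example, a typical AUTOEXEC.BAT of the era existed precisely to set up the command interpreter's own environment; the specific values below are illustrative rather than canonical:

    @ECHO OFF
    PATH C:\DOS;C:\UTILS
    PROMPT $P$G
    SET TEMP=C:\TEMP
    SET BLASTER=A220 I5 D1 H5 P330 T6

If those PATH and SET statements ran in a throwaway sub-process, they would evaporate the moment the batch file finished, and your freshly booted command prompt would have no search path, no prompt customization, and no sound card settings.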

Comments (25)
  1. SET BLASTER says:

    SET BLASTER=A220 I5 D1 H5 P330 T6

  2. Medinoc says:

    Actually, UNIX shell scripts combine both philosophies: there is one command to execute a script in a new shell process (the default), and another to parse a script in the current shell process, usually to set environment variables.

    I have not tested this, but it is possible that the second command does not work if the caller has execute permission but not read permission on the script.
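
    For comparison, cmd.exe exposes the same two philosophies with the defaults reversed: running (or call-ing) a batch file parses it in the current interpreter, while explicitly spawning a child cmd.exe throws the environment changes away. A minimal sketch, assuming a hypothetical setvar.bat that contains nothing but "set GREETING=hello":

    cmd /c setvar.bat
    rem GREETING is still undefined here; the change died with the child process.
    echo %GREETING%
    call setvar.bat
    rem Now GREETING=hello; the script was parsed by the current interpreter.
    echo %GREETING%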

  3. bd_ says:

    @Medinoc,

    There’s nothing particularly magical about shell scripts – the kernel just reads the shebang line, and passes the /filename/ to the interpreter listed. Thus, you always need read access to a shell script (unlike a bona fide executable).

  4. Barry Kelly says:

    @bd_,

    The second command Medinoc is talking about is known as ‘.’ or ‘source’ in bash, and it doesn’t require a #! at the start of the file.

    Both read and execute permissions are required to execute a file using OS mechanisms, but only read permission is required for ‘.’ (aka ‘source’), since that’s a function of the interpreter, not the OS.

    — Barry

  5. John says:

    @Zathrus:

    Visual Studio 2005 lets you change the environment. This has probably been available in previous versions, but I don’t have them in front of me to verify that.

  6. Jonathan says:

    If you don’t want your batch file to change the env, use SetLocal.
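
    That is, environment changes made after SETLOCAL are rolled back at ENDLOCAL (or when the batch file ends), so the calling shell never sees them. A minimal sketch with a made-up variable name; note that SETLOCAL is a cmd.exe extension, not something the original COMMAND.COM understood:

    @echo off
    setlocal
    set SCRATCH=C:\temp\build
    rem ...commands that rely on SCRATCH go here...
    endlocal
    rem SCRATCH is gone again here, and the caller's environment is untouched.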

  7. required says:

    I never understood the lack of functions in batch files (well, you could make do with the call keyword instead, but you know, it is considered harmful). Maybe a function keyword could have been added as syntactic sugar around the call keyword.

    As a result, lots of potential programmers were brain-damaged.
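
    For what it's worth, later versions of cmd.exe get fairly close to functions without a new keyword: call can target a label inside the same batch file and pass it arguments. A quick sketch, with invented names:

    @echo off
    call :greet World
    call :greet "batch fans"
    goto :eof
    :greet
    rem The label acts as a function; %~1 is its first argument with quotes stripped.
    echo Hello, %~1!
    goto :eof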

  8. Between call, variables, and labels, batch files were (and still are) extremely powerful.

    And for those who think that the 16KB was a typo: a popular contemporary computer was the Commodore VIC-20. It shipped with a full 5KB of RAM, of which only 3.5KB was available to you. And if you maxed out the RAM on a VIC-20, you had a whopping 21KB! :D (And people wonder where the "famous" 640KB quote came from! It was a massive amount of RAM back then! No one envisioned an IBM-compatible PC with > 4GB of RAM back then!)

    Considering the memory "constraints", computers were far more efficient back then…

  9. ERock says:

    It’s really not fair, either, to say "Unix does it!" since, at the time, Unix systems were administered by very smart people who gave the users specific instructions on how to just get their work done. The Personal Computer was intended to be used by mere mortals.

  10. Cooney says:

    Besides, if batch files couldn’t modify the environment of the command interpreter, the AUTOEXEC.BAT file would be pretty useless.

    Nah, it’d just be called profile. Of course, Unix tends to split machine-specific stuff like ‘SET BLASTER’ from the path manipulation that shells do.

  11. DriverDude says:

    Ahhh, forget about Win32 compatibility, or Win16, or DOS… we need to do it the way CP/M did.

    Even on Win64. :=)

    By the way, anyone know if CP/M runs in Virtual PC? (Does it need special PC/XT hardware?) I can’t find my copy now…

    "The Personal Computer was intended to be used by mere mortals."

    The PC was also affordable by mere mortals.

  12. Keithius says:

    @SET BLASTER:

    Oh geez, those were the days… ;-)

    *smiling fondly with nostalgia, sort of*

  13. Cooney says:

    Ahhh, forget about Win32 compatibility, or Win16, or DOS… we need to do it the way CP/M did.

    Or we could sideline the cmd.exe stuff and restart with a bash-style interpreter. Nothing gets broken and we get saner operation.

  14. Zathrus says:

    @bd_,

    Not quite. If you just invoke the script (e.g. ./foo, or foo if it’s in your path) then it does use the shebang to figure out the proper interpreter.

    If, however, you ask it to source the script (. ./foo) then it will use your CURRENT shell to interpret the script — which may be a problem if the script was written for bash and you’re running ksh, etc. (or it’s not even shell, but perl, etc)

    Yes, MS has been stuck with "source by default" for ages now because of how DOS 1.0 was implemented. But, to get back to the previous issue, you can certainly make Monad/PowerShell get environment changes from a shell, although it takes some hackery to do so (and I’ve seen this done in Perl to get env changes from a shell script): you run the shell script in a child shell, have that shell print out its environment just before exiting, and then parse the output; from there you can set your environment directly, diff it against the current environment, or do various other hacks (a batch-file sketch of this trick follows this comment). Yes, it’s ugly, but there’s no clean solution *IF* the script you’re executing is being handled by a different interpreter!

    The only other option to the above would be to violate child process memory protection and let you peek at their environment prior to killing the process. That’s a really bad idea in my opinion, since it violates all kinds of explicit security checks. Which is why Unix doesn’t do it either (see the first two paragraphs).

    As for other environment idiocy — why does Visual Studio *still* not let you change the environment for processes you’re going to debug? Why do I have to quit VS, change the environment (at command-line or via System Properties) and relaunch for such a simple thing?
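
    For the record, a minimal batch-file sketch of the trick Zathrus describes, with illustrative file names: run the script in a child cmd.exe, dump the child's final environment to a file, then replay each VAR=value line into the current interpreter. (FOR /F is a cmd.exe extension, and values containing newlines would confuse it.)

    @echo off
    rem Run the target script in a child interpreter, then dump its environment.
    cmd /c "call other.bat >nul && set" > "%TEMP%\envdump.txt"
    rem Replay each VAR=value line into the current interpreter.
    for /f "usebackq delims=" %%A in ("%TEMP%\envdump.txt") do set "%%A"
    del "%TEMP%\envdump.txt"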

  15. MS says:

    "Or we could sideline the cmd.exe stuff and restart with a bash style interpreter. nothing gets broken and we get saner operation."

    You mean something like PowerShell?  If you really want, you can get SUA and then you get all sorts of shells.

  16. JamesNT says:

    SET BLASTER….

    Good lord. Why is it that the first thing that came to my mind was configuring my AUTOEXEC.BAT so I could play 7th Guest?

    JamesNT

  17. Zathrus says:

    @John (and @ERock later):

    So after searching for some time on both Google and MSDN, it appears that VS2k5 can set environment variables by using User Defined Macros on a Property Sheet (cf. http://forums.microsoft.com/MSDN/ShowPost.aspx?PostID=149399&SiteID=1 and http://msdn2.microsoft.com/en-us/library/a2zdt10t(VS.80).aspx).

    Great. Except that in a basic Win32 Console Application I cannot figure out how to do it. Nor is it an option in the real projects I have, but since those are Windows Mobile apps, I’m not really surprised (or concerned; my previous position was cross-platform across Windows and 4 different flavors of Unix/Linux; it was a considerably larger issue then).

    "The Personal Computer was intended to be used by mere mortals."

    And you’re actually going to try to use this as a defense of Windows’ (and DOS’s) convoluted command-line/batch processing? Certainly some design decisions are forgivable (such as the one mentioned in the original post), but FOR’s weirdness?

    I’m deeply familiar with both Unix and Windows, and both have their advantages, but batch processing and automation has never been one of Windows’ strengths.

  18. Gazpacho says:

    I would guess that batch files didn’t support functions because Microsoft expected that anyone doing something that sophisticated would use BASIC or one of their other fine programming language products.

  19. Chris Oldwood says:

    Zathrus,

    Why don’t you launch Visual Studio with the correct environment in the first place, using the /useenv switch on devenv?

    In a batch file, set up your PATH, INCLUDE, LIB, etc. (we use STLport, so it makes life easier) to create your "sandbox", and then start devenv like this:

    start devenv.exe /useenv solution.sln

    We rely on this to ensure that every developer is using the correct compiler, libraries etc.
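
    Filled out, such a batch file might look like this; the paths are placeholders, not a real layout:

    @echo off
    rem sandbox.bat: build against a private toolchain, then launch VS.
    set PATH=C:\sandbox\bin;%PATH%
    set INCLUDE=C:\sandbox\stlport\include;%INCLUDE%
    set LIB=C:\sandbox\stlport\lib;%LIB%
    rem /useenv tells devenv to take PATH/INCLUDE/LIB from this environment
    rem instead of its saved directory settings.
    start devenv.exe /useenv solution.sln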

  20. MadQ says:

    @Zathrus: In the project properties you can set environment variables under Debugging/Environment, and you can also choose to merge the debuggee’s environment with the current one. It’s been there since like, forever.

    But even if that weren’t possible… to quote Raymond: "Don’t be helpless." Are you a programmer or what? VS add-ins run in-process. Add a command to set environment variables. The debuggee inherits VS’s environment (unless you have Merge Environment set to No). I can think of at least three other ways off the top of my head to set environment variables for a debuggee.

  21. John Elliott says:

    @DriverDude:

    According to Ben Armstrong’s blog, you can run CP/M-86 under Virtual PC:

    http://blogs.msdn.com/virtual_pc_guy/archive/2004/10/16/243262.aspx

    The original CP/M was for the 8080 processor and if you want to run that under Windows you’ll need an emulator like MYZ80.

    Oh, and CP/M doesn’t have environment variables at all.

  22. GreaseMonkey says:

    Did the 5150 manage to run 8088 Corruption with 16KB of RAM? The executable is 9.7KB, although I think the sound buffer is basically as large as it can be on such a system. I’m assuming that DOS is in ROM, amirite?

    Oh, and by the way, the 8088 *DID* support multitasking. It wasn’t built into the CPU hardware, but it supported it through software.

    Hmm… checksum hacks… that would be fun…

  23. Anon says:

    "I never understood the lack of functions in batch files (well, you could manage to use the call keyword instead, but you know, it is considered harmful). Maybe a function keyword could have been added as syntactic sugar around the call keyword."

    I always implemented functions using recursion. IIRC something like this –

    @echo off

    if "%1"=="/function1_name" goto function1_name

    if "%1"=="/function2_name" goto function2_name

    call thisbatchfile.bat /function1_name arg1 arg2

    call thisbatchfile.bat /function2_name arg1 arg2

    goto exit

    :function1_name

    ; params are in %2, %3 etc

    goto exit

    :function2_name

    ; params are in %2, %3 etc

    goto exit

    :exit

    "As result, lots of potential programmers were brain-damaged."

    I don’t really understand this idea at all. It seems to be that if you’re using anything simpler than C on Unix there’s no hope for you.

  24. BryanK says:

    Yeah, it was completely impractical then, given the hardware that DOS had to run on. Unix at the time was running on much, much better (…and more expensive) machines.

    That doesn’t mean it was the wrong way to do it, though — just that the right way was impractical, and Microsoft wasn’t interested in doing it right as much as they were interested in actually selling copies of the OS.

  25. By the way, surely there’s a fairly easy solution to the original problem – just get cmd.exe to run the .bat file followed by a program that dumps the environment to a file.

    (I believe the Gentoo Linux package management system, Portage, does something like this. It’s written in an odd mixture of Python and bash shell script, and is a general mess.)

Comments are closed.