PowerShell -EncodedCommand and Round-Trips

PowerShell.exe accepts the -EncodedCommand parameter, which is a way to "wrap" DOS-unfriendly command strings so they can be passed safely into PSH for execution. It's a great feature. However, it has a huge documentation hole. Let's see what PowerShell.exe /? has to say about it:

    Accepts a base-64-encoded string version of a command. Use this parameter
    to submit commands to Windows PowerShell that require complex quotation
    marks or curly braces.

And, it has a helpful example:

    # To use the -EncodedCommand parameter:
    $command = 'dir "c:\program files" '
    $bytes = [System.Text.Encoding]::Unicode.GetBytes($command)
    $encodedCommand = [Convert]::ToBase64String($bytes)
    powershell.exe -encodedCommand $encodedCommand

That’s pretty useful, right?  Do you see the documentation hole?

It’s missing a way to convert it back. 

“Why would I need a way to convert it back?  It’s encoded, and it’s good, right?”

I'm sure it is.  However, from my SDET (Software Development Engineer in Test) background, I've learned not to trust software.  There's the concept of a "round trip," where data is transformed, then transformed back, and the two instances of data had better be identical.

For purposes of discussion, here’s the reverse process.

$decodedCommand = [System.Text.Encoding]::Unicode.GetString([Convert]::FromBase64String($encodedCommand));
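Putting the two halves together, a minimal round-trip check might look like this (variable names are mine; -cne forces a case-sensitive comparison):

    # Forward transform: string -> UTF-16LE bytes -> base64
    $command        = 'dir "c:\program files" '
    $bytes          = [System.Text.Encoding]::Unicode.GetBytes($command)
    $encodedCommand = [Convert]::ToBase64String($bytes)

    # Reverse transform: base64 -> UTF-16LE bytes -> string
    $decodedCommand = [System.Text.Encoding]::Unicode.GetString(
        [Convert]::FromBase64String($encodedCommand))

    # Round-trip check: the two instances had better be identical
    if ($decodedCommand -cne $command) { Write-Error "Round trip failed" }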

“But, PSH hasn’t given us any reason to distrust it, so why go through this trouble?”

Actually, they have.  Any guesses as to which one?


When you round-trip an object through:

$object | Export-CliXml -Path $path;
$object = Import-CliXml -Path $path;

…you end up with a feature drop: methods. CliXml persists only properties.  Now, I have no idea how to actually persist code functionality, but that’s not my point.  PSH, for all its crunchy goodness, does have areas where round-trip testing shows differences.
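To see the method drop for yourself, round-trip a live object (the temp-file path here is arbitrary):

    $path   = Join-Path $env:TEMP 'roundtrip.xml'
    $object = Get-Process -Id $PID          # a live Process object, with methods like Kill()

    $object | Export-CliXml -Path $path
    $copy = Import-CliXml -Path $path

    $copy.PSObject.TypeNames[0]             # Deserialized.System.Diagnostics.Process
    $copy | Get-Member -MemberType Method   # the original's Kill(), Refresh(), etc. are gone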

However, even if we couldn't find a difference, software test best practice indicates that we should at least evaluate the worthiness of round-trip testing, even if we don't implement it.

Comments (7)

  1. FWN says:

    Hi Tim,

    sound advice – though I'm fairly confident in the fidelity of base64string conversion, taking it for granted might just bite you in the ass when you least expect it.

    Regarding serializing and deserializing objects with Export-/Import-Clixml:

    You can do this with a PSTypeConverter and adding some Powershell properties using type-extension. Here's an example extension XML:

    <?xml version="1.0" encoding="utf-8"?>
    ...
         <CodeProperty IsHidden="true">
    ...
             <TypeName><PSTypeConverter FullTypeName></TypeName>
    ...

    (Only these fragments of the example XML survived the blog's formatting.)
    That's how the Exchange Shell does things, too. It comes with some limitations on what your object may look like (private fields/properties will not be preserved, and public properties may not throw an error when set, though the setter need not do anything).

    Other than those limitations though it's fairly easy to use.



  2. timdunn says:

    Re: Export-CliXml – Much thanks!  

    Re: Deserialization – Recently, I found a use for this aside from round-trip testing.  Running a PSH script as a scheduled task isn't very secure if the task is "powershell.exe path\to\script.ps1", because someone can overwrite script.ps1 (or add stuff like send-mailmessage -subject $cred.Username -body $cred.GetNetworkCredential().Password…)

    A more secure way is to create it as powershell.exe { (do-this; do-that) -and $soForth }.  However, there's only so much punctuation cmd.exe, which is tasked with interpreting the scheduled task, will accept before it throws up.  So, we use -EncodedCommand.  That's great, but how do we debug it when it fails a few months later?
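    When that day comes, a sketch like this can recover the command for debugging (paste the blob from the task's action into $blob; the one built here is just an illustration):

        # Suppose the task's action reads: powershell.exe -EncodedCommand <blob>
        $blob = [Convert]::ToBase64String(
            [System.Text.Encoding]::Unicode.GetBytes('dir "c:\program files" '))

        # Decode it to see what the task will actually run
        [System.Text.Encoding]::Unicode.GetString([Convert]::FromBase64String($blob))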


  3. FWN says:

    Hi Tim,

    you can secure the path to make it harder. We store our task-scripts in a subfolder of program files. To change these you need admin privileges, and if you have local admin, you can take apart the task anyway (or read the credentials from the local system credentials cache).

    This is our compromise between security and maintainability.

    Because passing a script as encoded command comes with an issue:

    For lengthy scripts this can fail because the encoded command exceeds the maximum command-line length. While not much of an issue on modern systems, it can be fairly crippling on Vista / 2008 / XP / 2003.
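    The length is easy to check before deploying; 8191 characters is cmd.exe's documented command-line maximum on XP and later, and the script path here is hypothetical:

        $script  = Get-Content .\task.ps1 | Out-String   # works on PSH v2; no -Raw needed
        $encoded = [Convert]::ToBase64String(
            [System.Text.Encoding]::Unicode.GetBytes($script))

        # Leave headroom for "powershell.exe -EncodedCommand " itself
        if ($encoded.Length -gt 8000) {
            Write-Warning "Encoded command is $($encoded.Length) chars; too long for cmd.exe"
        }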

  4. timdunn says:

    Security vs. Convenience, the classic tradeoff.  The risk increases with the scope and ability.  We're applying SDL mindset to this, so this means PSH has a lot of scope and ability.  Which we already know. 🙂

  5. Simple SImon says:

    Security? heh.

    The malware writers have found this huge hole and are actively exploiting it via Task Scheduler on a regular basis. At the moment, they are relatively easy to find, but it's only a matter of time before they add more than the two current layers of obfuscation.

  6. George says:

    I just ran into the task scheduler exploit myself via this encoding stuff. It is strange in the first place that I got malware on my least-used computer, but the scarier part is that not one of the freaking scanning tools found it! It hijacked my DNS entries, so I noticed the infection quickly, but to resolve it I had to dig myself into this mess. I work with different security tools on Windows and Unix/Linux, and the product quality of the so-called cutting-edge tools would be laughable if the only purpose of those tools weren't to protect you. I think security companies should start hiring real hackers again if they really mean to stay in business.

  7. timdunn says:

    http://www.businessinsider.com/john-mcafee-ill-decrypt-san-bernardino-phone-for-free-2016-2 has an interesting quote by John McAfee:

    And why do the best hackers on the planet not work for the FBI? Because the FBI will not hire anyone with a 24-inch purple mohawk, 10-gauge ear piercings, and a tattooed face who demands to smoke weed while working and won't work for less than a half-million dollars a year. But you bet your ass that the Chinese and Russians are hiring similar people with similar demands and have been for many years. It's why we are decades behind in the cyber race.
