We are releasing .NET Core SDK usage data that has been collected by the .NET Core CLI. We have been using this data to determine the most common CLI scenarios and the distribution of operating systems, and to answer other questions we’ve had, as described below.
For an open source application platform that collects usage data via an SDK, it is important that all developers who work on the project have access to that data, so that they can fully participate in and understand design choices and propose product changes. This is now the case with .NET Core.
.NET Core telemetry was first announced in the .NET Core 1.0 RC2 and .NET Core 1.0 RTW blog announcements. It is also documented in .NET Core telemetry docs.
We will release new data on a quarterly schedule going forward. The data is licensed with the Open Data Commons Attribution License.
.NET Core SDK Usage Data
.NET Core SDK usage data is available by quarter in TSV (tab-separated values) format:
The Data
.NET Core has two primary distributions: the .NET Core SDK for development and build scenarios and the .NET Core Runtime for running apps in production. The .NET Core SDK collects usage data while the .NET Core Runtime does not.
The SDK collects the following pieces of data:
- The command being used (for example, build, restore).
- The ExitCode of the command.
- The test runner being used, for test projects.
- The timestamp of invocation.
- Whether runtime IDs are present in the runtimes node.
- The CLI version being used.
- Operating system version.
The data collected does not contain personal information.
The data does not include Visual Studio usage since Visual Studio uses MSBuild directly and not the higher-level .NET Core CLI tools (which is where data collection is implemented).
You can opt-out of telemetry by setting the DOTNET_CLI_TELEMETRY_OPTOUT variable, as described in .NET Core documentation.
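For example, a build script can opt out before invoking the CLI. Here is a minimal Python sketch; the environment variable name comes from the docs, while the commented-out dotnet invocation is illustrative so the snippet runs without the SDK installed:

```python
import os
import subprocess  # used only if you uncomment the dotnet invocation below

# Copy the current environment and set the opt-out flag for child processes.
env = dict(os.environ, DOTNET_CLI_TELEMETRY_OPTOUT="1")

# subprocess.run(["dotnet", "build"], env=env)  # requires the .NET Core SDK
print(env["DOTNET_CLI_TELEMETRY_OPTOUT"])  # -> 1
```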
Shape of the Data
The following is an example of the data you will find in the TSV (tab separated values) files.
You will notice misspellings, like “bulid”. That’s what the user typed. It’s information. Maybe we should implement the same kind of “Did you mean this?” experience that git has. Food for thought.
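As a sketch of what such a suggestion feature might look like, here is a Python example using standard-library fuzzy matching; the command list and similarity cutoff are illustrative, not anything the CLI actually implements:

```python
import difflib

# A few real CLI commands to match against; illustrative, not exhaustive.
KNOWN_COMMANDS = ["build", "restore", "run", "publish", "test", "new", "pack"]

def suggest(typed):
    """Return the closest known command to what the user typed, or None."""
    matches = difflib.get_close_matches(typed, KNOWN_COMMANDS, n=1, cutoff=0.6)
    return matches[0] if matches else None

print(suggest("bulid"))  # -> build
```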
C:\dotnet-core-cli-data>more dotnet-cli-usage-2016-q3.tsv
Timestamp Occurences Command Geography OSFamily RuntimeID OSVersion SDKVersion
9/1/2016 12:00:00 AM 1 bulid India Windows win7-x86 6.1.7601 1.0.0-preview1-002702
9/8/2016 12:00:00 AM 1 bulid Republic of Korea Windows win81-x64 6.3.9600 1.0.0-preview2-003121
9/19/2016 12:00:00 AM 1 bulid United States Windows win81-x64 6.3.9600 1.0.0-preview2-003121
9/12/2016 12:00:00 AM 1 bulid Ukraine Windows win81-x64 6.3.9600 1.0.0-preview2-003121
8/12/2016 12:00:00 AM 2 bulid Netherlands Windows win10-x64 10.0.10240 1.0.0-preview1-002702
9/14/2016 12:00:00 AM 1 debug Hong Kong Windows win10-x64 10.0.14393 1.0.0-preview2-003121
9/14/2016 12:00:00 AM 1 debug United States Linux ubuntu.16.04-x64 16.04 1.0.0-preview2-003121
8/27/2016 12:00:00 AM 1 debug Belarus Windows win10-x64 10.0.10586 1.0.0-preview2-003121
9/16/2016 12:00:00 AM 1 debug India Darwin osx.10.11-x64 10.11 1.0.0-preview2-003131
8/31/2016 12:00:00 AM 1 debug Sweden Windows win10-x64 10.0.10586 1.0.0-preview2-003121
8/26/2016 12:00:00 AM 1 debug Netherlands Windows win10-x64 10.0.10586 1.0.0-preview2-003121
9/27/2016 12:00:00 AM 2 debug United States Windows win10-x64 10.0.10586 1.0.0-preview2-003121
8/2/2016 12:00:00 AM 1 debug Ireland Linux ubuntu.16.04-x64 16.04 1.0.0-preview2-003121
8/10/2016 12:00:00 AM 1 debug United States Windows win7-x64 6.1.7601 1.0.0-preview1-002702
8/18/2016 12:00:00 AM 1 debug United States Linux ubuntu.16.04-x64 16.04 1.0.0-preview2-003121
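To make the shape of the data concrete, here is a minimal Python sketch that totals occurrences per command. The inline sample abbreviates the rows above (note that the column header really is spelled “Occurences” in the data); point the reader at the downloaded TSV file instead for real analysis:

```python
import csv
import io
from collections import Counter

# Abbreviated inline sample of the TSV data shown above.
sample = (
    "Timestamp\tOccurences\tCommand\tGeography\tOSFamily\tRuntimeID\tOSVersion\tSDKVersion\n"
    "9/1/2016 12:00:00 AM\t1\tbulid\tIndia\tWindows\twin7-x86\t6.1.7601\t1.0.0-preview1-002702\n"
    "8/12/2016 12:00:00 AM\t2\tbulid\tNetherlands\tWindows\twin10-x64\t10.0.10240\t1.0.0-preview1-002702\n"
    "9/14/2016 12:00:00 AM\t1\tdebug\tHong Kong\tWindows\twin10-x64\t10.0.14393\t1.0.0-preview2-003121\n"
)

# Tally total occurrences per command.
totals = Counter()
for row in csv.DictReader(io.StringIO(sample), delimiter="\t"):
    totals[row["Command"]] += int(row["Occurences"])

print(totals.most_common())  # -> [('bulid', 3), ('debug', 1)]
```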
Data for .NET Core 2.0
The data that has been collected with the .NET Core SDK has demonstrated some important gaps in our understanding of how the product is being used. The following additional data points are planned for .NET Core SDK 2.0.
- dotnet command arguments and options — Determine more detailed product usage. For example, for dotnet new, collect the template name. For dotnet build --framework netstandard2.0, collect the framework specified. Only known arguments and options will be collected (not arbitrary strings).
- Containers — Determine if the SDK is running in a container. Useful to help prioritize container-related investments.
- Command duration — Determine how long a command runs. Useful to identify performance problems that should be investigated.
- Target Framework(s) — Determine which target frameworks are used and whether multiple are specified. Useful to understand which .NET Standard versions are the most popular and whether new guidance should be written, for example.
- Hashed MAC address — Determine a cryptographically (SHA256) anonymous and unique ID for a machine. Useful to determine the aggregate number of machines that use .NET Core.
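The hashing idea above can be sketched as follows. This is an illustration of the general technique, assuming a particular MAC normalization for the example; it is not the SDK’s actual implementation:

```python
import hashlib

def machine_id(mac):
    """Derive an anonymous machine ID by hashing the MAC address with SHA-256.

    Assumes an upper-case, colon-separated normalization so that the same
    machine always produces the same hash; only the hash is ever stored.
    """
    normalized = mac.upper().replace("-", ":")
    return hashlib.sha256(normalized.encode("ascii")).hexdigest()

# The same MAC in two common notations yields the same anonymous ID.
print(machine_id("00-1A-2B-3C-4D-5E") == machine_id("00:1a:2b:3c:4d:5e"))  # -> True
```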
Product Findings and Decisions
This data has been very useful to the .NET Core team for a year now. In some cases, like looking at overall usage or at the usage of specific commands, we are very reliant on this data to make decisions. For more specific decisions, like the case of removing the OpenSSL dependency on macOS, we used the data as secondary evidence to user feedback.
Here are some interesting findings that we have made based on this data:
- .NET Core usage is growing — >10% month over month.
- .NET Core usage is geographically diverse — used in 100s of countries and all continents.
- .NET Core CLI tools are a very important part of the overall .NET Core experience — relative to .NET Framework, the CLI tools are novel.
- Developers do not use the .NET Core SDK the same way on Windows, macOS and Linux — the popular commands are different per OS.
- The publishing model for .NET Core apps is likely confusing some people — the difference in the popular commands suggests a use of .NET Core that differs from our guidance (more investigation needed).
- We have more work to do to reach out to the Linux and macOS communities — we would like to see increased use of .NET Core on those OSes.
- Our approach to supporting Linux (one build per distro) isn’t providing broad enough support — .NET Core was used on high 10s of Linux distros yet it only works well on 10-20 distros.
- There are gaps in the data that limit our understanding — we would like to know if the SDK is running in a container, for example.
We made the following changes in .NET Core 2.0 based on this data:
- .NET Core 2.0 ships with a single Linux build, making it easier to use .NET Core on Linux. .NET Core 1.x has nearly a dozen Linux builds for specific distros (for example, RHEL, Debian and Ubuntu are all separate) and limits support to those distros.
- .NET Core 2.0 does not require OpenSSL on macOS, with the intention of increasing adoption on macOS.
- .NET Core 2.0 will be easily buildable from source so that Linux distros can include .NET Core in their package repository/archive/collection. We are talking to distros about that now.
- We will attend and/or encourage local experts to participate in more conferences (globally) to talk about .NET Core.
More forward-looking:
- Fix the build and publishing model for .NET Core — the differences between run, build and publish are likely confusing people.
- Enable more CLI scenarios — enable distribution of tools, possibly like the way NPM does global installs.
.NET Core SDK Insights
The data reveals interesting trends about .NET Core SDK usage. Let’s take a look at historical data (since June 2016):
Note: this data is just from direct use of the CLI. There is of course a significant amount of .NET Core usage via Visual Studio, as well.
Command Variations by Operating System

There are some interesting and surprising differences in command usage between operating systems. We can see that build is by far the leading command on Windows, run on Linux, and restore on macOS. I’d interpret this to say that we’re seeing a lot of application development on Windows, maybe more “kicking the tires” applications on macOS scaffolded using Yeoman (since dotnet new usage is low on macOS), while Linux is primarily being used to host applications.
Note: The chart says “OSX”, which is the old name for macOS.
Weekly Trends

.NET Core usage is growing over time. You can see that there’s an obvious cycle that follows the work week.
Geographic Distributions

It’s interesting to take a look at the geographic variations in operating system usage. Most geographies have a mix, but you can see that some areas run predominantly on a single operating system, at least as it relates to .NET Core usage.
This data and visualization are based on the IP address seen on the server; it is not collected by the CLI. The IP address is not stored, but converted to a 3-octet IP address, which is effectively a city-level representation of that data.
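The truncation described here can be sketched in a few lines of Python. This is an illustration of the idea for IPv4 addresses, not the server’s actual code:

```python
def truncate_ip(ip):
    """Zero out the last octet of an IPv4 address.

    Keeping only the first three octets (a /24 network) is roughly
    city-level granularity, so the full address is never stored.
    """
    first_three = ip.split(".")[:3]
    return ".".join(first_three) + ".0"

print(truncate_ip("203.0.113.57"))  # -> 203.0.113.0
```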
Overall Operating System Distribution

Given .NET’s roots, it’s not surprising to see a large Windows following. It’s exciting to see substantial Linux and Darwin (macOS) usage as well.
Operating System Version Distribution

It looks like .NET Core is running mostly on the newest operating system versions at this point. This aligns with our expectation that .NET Core has been adopted mostly by “early adopters” to this point. In 2-3 years, we expect that the operating system distribution will be more varied.
More to Come
We will continue to make this data available to you in a timely manner, and we’re going to look into making it possible for you to visualize the kinds of trends we’re seeing (like in the images above). For now, we’re making the raw data available to you.
Thanks to everyone that has been using .NET Core. The community engagement on the project has been amazing and we are making a great product together. This information is helping us make the product better and will become even more useful in the future. We are now doing our part to make the data collected publicly available. This makes good on a statement that we made at the start of the project, that we would release the data. We now look forward to other developers reasoning about this data and using it as part of project decision making.

spelling. bulid
Good catch. I added a point on that just now.
That’s what the user typed. It’s a misspelling by the user, not a misspelling in the telemetry or the blog post.
So who’s using .NET Core from Antarctica?
I asked the same question. No idea. Two options: research scientists doing .NET Core development or bad IP address reporting.
People bored with endless snow? 🙂
I pretty much stopped using dotnet in March when I found out it was collecting this telemetry. I believe it’s wrong that tools spy on their users: https://opinionatedgeek.com/Blog/2017/3/26/your-tools-shouldn-t-spy-on-you
Spying is a good word for what’s happening here. There’s no guidance when you install, no prompt to ask you if this is all OK, just the sneaky sending of data that you may not even know about. If all you did was install and run the tool, you wouldn’t know it was sending data to Microsoft. It’s hidden away unless you’re a fan of technical blogs.
Despite someone’s Github issue – https://github.com/dotnet/cli/issues/3093 – (over a year old and still running), despite someone else’s Pull Request – https://github.com/dotnet/cli/pull/7096 – switching telemetry off by default, we are in the situation where Microsoft now seems intent on making the tool’s spying even worse, all while talking about community engagement.
Because now, as well as gathering data whenever you run the tool, it’s going to capture and pass on a token to uniquely identify your computer. And what’s worse is this token can be discovered by anyone on your LAN. Worse still, the plan is for this data to be made public.
Want to know what your former colleague was doing before they left the company last month? Just find out their computer’s MAC address (from their network card), compute the SHA256 of it, and then you can search this data Microsoft is making public and see the commands they ran and when, right down to their typos. See? You can spy too now! (How soon until this telemetry is evidence in a lawsuit, I wonder.)
‘The data collected does not contain personal information.’?
And then there’s the opt-out mechanism. To stop the tool opening network connections I didn’t ask it to, or sending data I don’t want it to, I have to specify an environment variable. To ensure that’s done, I need to put that in the user initialisation of every shell of every user of every machine and every container that might possibly be running dotnet. And if I make one slip-up, the tool spies on me again.
But the problem is not the identifying token. The problem is not the publishing of the data. The problem isn’t the poor opt-out mechanism (for users who didn’t opt in in the first place!) The problem isn’t even the opting everyone in by default.
The problem is the normalisation of this spying. The drip drip drip of taking more information, combined with making it hard to configure the tool so it doesn’t spy on you. The problem is having to monitor everything about the tool because you can’t trust it. The problem is the attitude that says “We know you don’t want us collecting this information, so we’re not even going to ask you about it when you install.”
Without asking the user if it’s OK, there’s no informed consent. Taking data without informed consent is bad. Publishing data without informed consent is bad. It annoys me that I have to state these things.
I suppose that in the end it all comes down to the question I asked in March: “Would you prefer a tool you can just trust, or a tool that may have better features but that you constantly have to check to verify isn’t doing anything it shouldn’t?” I’d prefer a tool I can trust. Since March, dotnet has not been my preference. I prefer my tools Private By Default.
Thanks for the feedback.
The .NET Core installer tells you about data collection. It is intended to be clear and obvious. It also includes a link to .NET Core docs that explains how to opt-out. You can see the UI and the complete text in this gist: https://gist.github.com/richlander/1d2a21e3642b4fcf352c57f6f37ea154.
The data we shared doesn’t have any personal information in it. That won’t change in the future. We won’t share SHA256 hashed MAC addresses. As you suggest, that would not be a good approach.
The product has an opt-out policy in place. This approach enables us to understand how the product is used. As you can see from the blog post, this information has proved useful in prioritizing important improvements for the product. We will continue to invest in the product based on the insights we and others derive from the data.
We are doing .NET Core 2.1 planning now and will soon share what we are thinking for the new release, much of which is informed by this data.
That all said, I respect your viewpoint on telemetry and your choices.
🙁
The most disturbing thing is that you and Microsoft think this is alright!
This is just my opinion, but I think the vast majority of people are fine with the “drip drip drip of taking more information” as long as it leads to better experiences. I think we generally trust Microsoft, Google, Facebook, or whomever, not to do anything more malicious with our data than using it to increase their profits; they are commercial enterprises, after all, and if they don’t make money we don’t get the stuff they give us.
I don’t know what the metrics are on how many people join the SQL Server Customer Experience program, or similar, but I always click “Yes” for those things because if I trust these people to build the tools and platforms on which I’m building my business or running my life, why wouldn’t I trust them with the data they need to improve those things?
Wonder how many of them have been asked!
Nice to see some stuff happening in Africa. I know I’m one of the guys on the metric. 🙂
Nice! If you’ve got ideas on how we’d do a better job reaching out to developers in Africa or if development challenges differ in some way in Africa, I’d love to know. Conference suggestions? Send me mail at rlander@ms if want to chat more.
Greetings from Egypt, North Africa.
Expectation is misspelled in the “Overall Operating System Version Distribution” paragraph: “expecation that .NET Core”.
Fixed. Thanks!
I’m using asp.net core on a mac but building/starting is too slow. The speed is good after that, but this is really frustrating during development.
Do you have some charts on command duration?
Other languages like Go and Rust compile and start a lot faster.
I agree. I use Windows at work and Mac at home and on windows speed is faster than on Mac by an order of magnitude. Collecting metrics on this would help confirm this.
Interesting. I’d like to learn more about this. Is it commandline usage in both cases, Visual Studio in both cases, or is it Visual Studio on Windows and commandline on macOS?
If you give me a bit more info, I’ll get my performance team to spend more time on this. Send me mail at rlander@ms if you’d like to share more details.
I’d venture to say it’s because you cannot precompile Razor views outside of Windows. There is an open issue and a hacky workaround if you have access to a Windows machine, but it seems like the priority of that issue should be really high for the ASP.NET Core team. They pushed it to 2.1.0, but if you want to get people on board it needs to be in 2.0.
In the ASP.NET Community Standup last week, Damian mentioned that one of the things they were focusing on for 2.1 was optimizing the development loop, so that build-and-run (and dotnet watch) will be a lot faster on all platforms.
> You will notice misspellings, like “bulid”. That’s what the user typed. It’s information. Maybe we should implement the same kind of “Did you mean this?” experience that git has. Food for thought.
Or maybe implement a GUI, which is intended to solve these very problems (and oh so much more). Food for thought. 🙂 🙂 🙂
If you are on macOS or Windows, we have a great GUI for you. It’s Visual Studio. On Linux, yes, you need to type these commands.
But yes, I can imagine what kind of GUI you would like. I love your enthusiasm!
Did you learn anything about performance? Especially the performance regression when migrating from project.json to MSBuild? There are numerous issues on GitHub tracking performance issues. Just wondering if the telemetry supports performance tracking also. Or maybe it’s just platform specific?
Currently, there is no performance information in our telemetry. The plan is to add duration to commands, which will add a first level form of performance information. That will be super helpful.
We also will start publishing docs on how to generate traces for performance investigations. We’d like to move to a model where you can send us performance traces when you think that the platform is doing something odd w/rt performance or that performance should just be better.
Right now, the .NET Core performance team is flying a bit blind as it relates to real-world performance. Certainly, we’ve made a bunch of performance improvements that will be helpful but we don’t know if they help >90% of apps or <10%. It can be hard to tell. So, the plan is to fix that!
The color switch between Windows and Darwin gave me a small heart attack in ‘Overall Operating System Distribution’ section
Sorry about that! Someone else made that point when I reviewed a draft post with a small set of community folks. I admit that I forgot to change it.
What you really should do, is to find the feedback channels that are consistent with the telemetry on concerns and issues highlighted.
Those are valuable feedback channels — prioritise them, and build two-way dialogue there.
We can do more here, for sure. I think that the macOS/OpenSSL case that I covered in the post is a good example. In that case, we saw a lot of negative feedback on the installation process and we also saw lower usage on macOS than expected. In that case, we were able to pair feedback and telemetry together — they matched. You are right that we need to do more of that. Now that the telemetry is published, we can share that same context when we create issues for product proposals.
Linux devs who also target Mono would also need to bypass the CLI and use MSBuild commands directly. All those devs do not factor into that telemetry and should not be forgotten. It is also a shame that the CLI is hardwired to be .NET Core only.
Certainly, we don’t want to forget those devs! The experience for building Mono/.NET Framework libraries and apps with the CLI isn’t great today. It’s something that I personally want to be better since I want to do it for my own projects. It’s already in my draft .NET Core 2.1 plan.
Yes please!
This info is really interesting. Usage for Darwin and Linux is higher than I would expect at this point.
Yes. We cannot fully reason about all the patterns and usage we are seeing in the data. The usage patterns are not the same on each OS, for example. I would have naively expected macOS/Darwin to be higher than Linux at first, but this is not the case.
One guess is that .NET Core has become very popular with Docker and that this is the reason for so much Linux/Debian usage. See the pull counts for .NET Core -> https://hub.docker.com/r/microsoft/. It’s not small. At present, we’re seeing about 1M pulls/month.
I would argue, that Linux could be underrepresented by the fact that more Linux users are reluctant to opting in to data collection. Even more so on production environments.
Could well be true. On your comment about production: there is no telemetry enabled in (or even part of) the runtime. Most people deploy with the runtime only in production, so the product as a whole is underrepresented. I should have made that more clear in the post. The entirety of this data is dev and build scenarios solely when using the CLI. All other scenarios are not part of it, so .NET Core total usage is much larger than this data suggests.
Can you please complete support for arm64?
Thanks,
Tony
I’d love to learn more about your need for that. ARM64 is very common in mobile and very early (AFAIK) on the server. Is it for one of those? Feel free to mail me at rlander@ms if you’d like to share more that way.
The most popular runtime IDs are the ones provided in an MSDN blog post tutorial about publishing self-contained .NET Core apps. I wouldn’t read too much into it beyond that, guys.
Very fair. It can be argued two ways: our guidance drives usage patterns, or we picked guidance that aligned with what people were most likely to want anyway. It is probably both.
So despite the huge Microsoft push for other OSes to use .NET Core, Windows remains the overwhelming usage scenario.
I think you’re taking exactly the wrong message from this. It doesn’t mean you need to devote *more* resources to other OSes, it means you guys should start focusing on Windows again (where your *paying* customers actually are).
Make WPF/WinForms platform specific libraries that are compatible with .NET Core (similar to how the Xamarin IOS/Android stuff works). Make a simple upgrade path from WPF to Xamarin Forms for Windows development. These are the kinds of things that would benefit your legions of Windows users, who have not had any substantial updates in their frameworks in 5+ years. “Modern” apps are, and probably always will be, inappropriate for most serious business applications because of their many restrictions and low information density.
That is a good point. Features do typically come to Windows first. Two good examples: RyuJIT x86 is still Windows-only and profile guided optimization of CoreCLR binaries came a whole release before Linux. On Linux and macOS, various parts of the experience don’t yet have parity with Windows. Installation experience is a good example, which is super important for growing a user base. That’s why we invested in removing OpenSSL on macOS and have been adding native installers for more Linux distros.
Separately, and I think this is your main point, is that there should be more investment in the UI platform. I agree with you. A group of folks has been looking at (generally, not specific to .NET Core). Actually, if you’d like to give them your direct feedback, I can hook you up. Mail me at rlander@ms.
Thanks, Rich. Those examples are valid, but I’d argue that they are both low-level and not something that dramatically impacts most developers’ day-to-day.
The UI stack itself is a big part, but not the entirety, of my point. When you go to make a new .NET project on Windows today you’re presented with WPF/WinForms/ASP.NET/Console on .NET Framework or Console/ASP.NET on .NET Core. This is a pretty crappy situation to be in, and I believe if the focus had remained on Windows and so much effort wasn’t being spent on x-plat, this wouldn’t still be the case.
It’s also not helpful that Microsoft continues to publicly insist that .NET Framework will continue to be developed in parallel, while every developer knows the writing is on the wall (which MS employees will privately admit) and that we need a viable upgrade path to .NET Core. We need a real, public, community-vetted plan of how the transition is going to happen.
In summary, IMO, .NET can and should be Windows-first; the dev experience for .NET Windows devs currently needs a lot of work, which I think should be the priority rather than getting .NET Core working on 5, 10, or 50 different Linux distros.
Happy to discuss anything further.
Are “Occurrences by RuntimeID” total user installs? How many installs on average per device?
I believe that is # of events (for example, “dotnet build” is a single event) per RuntimeID. So, given 1000 occurrences, there may be 1 user or 1000 users.
How much stuff going on in Nepal?? Trying to figure out but didn’t see clearly. I am the one working on .Net Core since its beginning.
Well, now that you have access to the data, you could write a simple tool yourself to figure that out! That was a big part of the motivation for releasing the data.
I would say MS got nothing helpful, just predictable data: Linux world more talks about “portability” than they do. Main MS target is desktop and they better NOT spread forces on marginal cr@p like Linux/MacOS/Solaris/etc.
I want to ask MS “smarties”: how it was possible 10 years ago people write programs without any telemetry at all?! And it was HIGH QUALITY programs! (opposite to modern MS cr@p) Answer is quite simple: nowadays MS has only indian dancers, not professional developers! That’s why MS moves absolutely wrong direction in .NET way.