A Real Estate Mogul Will Spend $100 Million to Fix Social Media Using Blockchain
Read more of this story at Slashdot.
The Biggest Nonmilitary Effort in the History of Human Civilization
Charles Fishman has a new book, One Giant Leap, all about NASA’s Apollo program to land an astronaut on the moon. He talks about it on Fresh Air with Dave Davies.
On what computers were like in the early ’60s and how far they had to come to go to space
It’s hard to appreciate now, but in 1961, 1962, 1963, computers had the opposite reputation of the reputation they have now. Most computers couldn’t go more than a few hours without breaking down. Even on John Glenn’s famous orbital flight — the first U.S. orbital flight — the computers in mission control stopped working for three minutes [out] of four hours. Well, that’s only three minutes [out] of four hours, but that was the most important computer in the world during that four hours and they couldn’t keep it going during the entire orbital mission of John Glenn.
So they needed computers that were small, lightweight, fast and absolutely reliable, and the computers that were available then — even the compact computers — were the size of two or three refrigerators next to each other, and so this was a huge technology development undertaking of Apollo.
On the seamstresses who wove the computer memory by hand
There was no computer memory of the sort that we think of now on computer chips. The memory was literally woven … onto modules and the only way to get the wires exactly right was to have people using needles, and instead of thread wire, weave the computer program. …
The Apollo computers had a total of 73 [kilobytes] of memory. If you get an email with the morning headlines from your local newspaper, it takes up more space than 73 [kilobytes]. … They hired seamstresses. … Every wire had to be right. Because if you got [it] wrong, the computer program didn’t work. They hired women, and it took eight weeks to manufacture the memory for a single Apollo flight computer, and that eight weeks of manufacturing was literally sitting at sophisticated looms weaving wires, one wire at a time.
One anecdote that was new to me describes Armstrong and Aldrin test-burning moon dust, to make sure it wouldn’t ignite when repressurized.
Armstrong and Aldrin actually had been instructed to do a little experiment. They had a little bag of lunar dirt and they put it on the engine cover of the ascent engine, which was in the middle of the lunar module cabin. And then they slowly pressurized the cabin to make sure it wouldn’t catch fire and it didn’t. …
The smell turns out to be the smell of fireplace ashes, or as Buzz Aldrin put it, the smell of the air after a fireworks show. This was one of the small but sort of delightful surprises about flying to the moon.
Hackers Built a 'Master Key' For Millions of Hotel Rooms
NIX – No More Guessing about Hydration Levels
NIX is attempting to maintain optimal hydration levels with its single-use, stand-alone biosensor, which continuously monitors sweat for hydration levels. The data provided tells an athlete when, how much, and what to drink. NIX is monitoring not only hydration but also electrolyte loss to maximize performance, and it has plans to monitor skin, oral health, and neonatal nutrition going forward.
Spring Favorites 2017
“It is not enough that yearly, down this hill,
April
Comes like an idiot, babbling and strewing flowers.”
–Edna St. Vincent Millay “Spring”
The cacophony of the Trump era has made it hard for me to focus on much in the way of culture. The political news is too much with us, and it can easily be all-consuming. Dan prefers to avoid it as much as possible, while I feel the need to stay alert to each new threat. But even I need my breaks from it. Here are the things that have been reminding me of my humanity lately:
Reading:
Evicted. Matthew Desmond’s rigorous, thorough, and fascinating exploration of the rise in evictions in American cities and the ways that the lack of affordable housing disrupts lives and destabilizes neighborhoods. It’s a subject ripe for study, and it’s surprising that so little research has been done on evictions–especially in the wake of the housing bubble collapse. Desmond follows the stories of several households in Milwaukee as they struggle to find and keep one of the most basic of human needs, shelter. While, as you might guess, it’s a bit of a downer, the book feels urgent and insightful and I couldn’t put it down.
The Cross of Redemption. A collection of previously uncollected writings–mainly essays–from James Baldwin. At a time when democracy feels fragile and race relations fraught, Baldwin’s writing on his own experience as a black American through the middle part of the 20th century feels extraordinarily timely. But more than that, his writing is a profound pleasure to read. His sentences are sprawling, twisting, beautiful. His command of the English language is a thing to behold.
Jamie Lauren Keiles’s concise, erudite, and delightful ode to the squat.
Whet Moser’s examination of the proper temperature of streetlights as Chicago makes the change from orange high pressure sodium lights to much bluer and more energy efficient LEDs. I love diving into the aesthetic and practical issues of city maintenance. Also, the new LED streetlights have gone in by us and their brightness drives Dan crazy.
Jerry Useem’s look at dynamic pricing and the ways that the internet has changed and continues to change the relationship between buyer and seller in “How Online Shopping Makes Suckers of Us All.”
Margaret Atwood revisits ‘The Handmaid’s Tale’ in light of the upcoming Hulu series and the current political climate. A fascinating look back at the dystopian classic.
Ijeoma Oluo’s interview with Rachel Dolezal, the white woman who identifies as black. Oluo is excellent on context and brings an actual black woman’s voice to the discussion. I didn’t think I wanted to read anything more on the Dolezal sideshow, but I was so glad I read this one.
Watching:
Sneaky Pete, created by Bryan Cranston and starring Giovanni Ribisi, is a show for anyone with a weakness for charming grifters. And if you need more than that, Margo Martindale, one of the most delightful character actresses working today, has a supporting role. It’s streaming on Amazon Prime.
A Series of Unfortunate Events on Netflix is tremendously fun. It makes great use of Neil Patrick Harris’s theatrical and musical talents and the styling and casting is on point. We don’t have kids, and I’m not one to seek out children’s programming in general, but this one is absolutely recommended for all ages. After inhaling the series, Dan and I were singing the theme song for weeks.
Crazy Ex-Girlfriend. If you haven’t yet experienced the smart, manic, musical, oddball world of Rachel Bloom’s Rebecca Bunch, you’re missing out on one of the most creative and emotionally insightful shows on television. I mistakenly dismissed it at first, but I was so wrong. It’s streaming on Netflix.
Listening:
S-Town. If you’ve ever listened to This American Life or Serial, you probably know about this podcast juggernaut. If not, get thee to your nearest podcatcher and be prepared to be engrossed. Then you can read all of the commentary about it and decide for yourself if it’s overrated, exploitive, ignorant on parts of queer culture or insensitive in some other way. But regardless, it’s a fascinating story, reminiscent of a Southern gothic novel, and it’s totally worth your time.
The Hilarious World of Depression. John Moe’s podcast explores clinical depression through the lens of comedians who’ve experienced it. It’s a compassionate take on a subject that’s close to home for so many of us. I love the Dick Cavett episode.
You Must Remember This. Karina Longworth’s podcast about the secret and/or forgotten histories of Hollywood’s first century is a fascinating listen for anyone with an affection for old movies or a curiosity about how celebrity culture has evolved. I recently finished the series on MGM and the studio system. I especially loved the episodes on Eddie Mannix and Gloria Grahame.
Pod Save America, the political conversation from Obama bros Jon Favreau, Jon Lovett, Tommy Vietor and Dan Pfeiffer is a smart and funny take on the news of the day. When I’m ready to swim in the stream of current events, this is my favorite place to dive in. For unabashed liberals.
The Weeds, the podcast from Vox’s Ezra Klein, Sarah Kliff, and Matt Yglesias is a nerdy policy wonkfest. The trio go deep on NBER white papers and the detailed, wonky ins-and-outs of economic policy and healthcare policy and tax policy. I’ve long harbored a bit of resentment toward Klein and Yglesias for being a little bit younger than me and way more successful in a field I once aspired to work in, but even I can appreciate the work that they and Kliff have put in on understanding complicated policy issues and I’ve learned a lot from their discussions. In a world of tweets and whirlwind news cycles, it’s nice to have a thoughtful, detailed discussion of what effects proposed policies would have on real people.
The post Spring Favorites 2017 appeared first on Blossom to Stem.
A timeline map of the global median age from 1960-2060
From Aron Strandberg,1 this is a timeline visualization of the age of the world’s population from 1960-2060. The world’s human population has increased rapidly in the last couple centuries, most recently doubling since 1970:
A tremendous change occurred with the industrial revolution: whereas it had taken all of human history until around 1800 for world population to reach one billion, the second billion was achieved in only 130 years (1930), the third billion in less than 30 years (1959), the fourth billion in 15 years (1974), and the fifth billion in only 13 years (1987).
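The shrinking intervals in that passage are easy to verify with a few lines of arithmetic; the milestone years below are the ones quoted above, nothing more:

```python
# Years (quoted above) at which world population reached each successive billion.
milestones = {1: 1800, 2: 1930, 3: 1959, 4: 1974, 5: 1987}

# Years taken to add each additional billion.
intervals = {n: milestones[n] - milestones[n - 1] for n in range(2, 6)}
print(intervals)  # {2: 130, 3: 29, 4: 15, 5: 13}
```

Each additional billion arrived faster than the one before it, exactly as the quote describes.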
But watching that video, you’ll realize that the world’s population will not reach 20 or 30 billion in 2050 — human civilization is getting old.
-
Strandberg was also recently in charge of Sweden’s Twitter account, which they hand over to a random Swedish person each week. That’s where I found his chart.↩
Four of Iceland's Main Volcanoes Are All Preparing For Eruption
Announcing .NET Core Tools Updates in VS 2017 RC
Today, we are releasing updates to the .NET Core SDK, included in Visual Studio 2017 RC. You can also install the .NET Core SDK for use with Visual Studio Code or at the command line, on Windows, Mac and Linux. Check out the Visual Studio blog to learn more about this Visual Studio 2017 update.
The following improvements have been made in the release:
- Templates — dotnet new has been updated and is now based on a new templating engine.
- The location of the .NET Standard class library template in Visual Studio has been moved to the new .NET Standard node, based on feedback.
- Quality — ~50 fixes have been made across the tools to improve product reliability.
The quality fixes have been made across the .NET CLI, NuGet, MSBuild and also in Visual Studio. We will continue to squash bugs as we get closer to Visual Studio 2017 RTM. Please continue sharing your feedback on the overall experience.
Getting the Release
This .NET Core SDK release is available in Visual Studio 2017 RC, as part of the .NET Core cross-platform development workload. It is also available in the ASP.NET and web development workload and as an optional component of the .NET desktop development workload. These workloads can be selected as part of the Visual Studio 2017 RC installation process. The ability to build and consume .NET Standard class libraries is available in all of the above workloads and in the Universal Windows Platform development workload.
You can also install the .NET Core SDK release for use with Visual Studio Code or at the command line on Windows, macOS and Linux by following the instructions at .NET Core 1.0 – RC4 Download.
The release is also available as Docker images, in the dotnet repo. The following SDK images are now available:
1.0.3-sdk-msbuild-rc4
1.0.3-sdk-msbuild-rc4-nanoserver
1.1.0-sdk-msbuild-rc4
1.1.0-sdk-msbuild-rc4-nanoserver
The aspnetcore-build repo has also been updated.
Changes to Docker Images
We made an important change with this release to the tags in the dotnet repo. The latest and nanoserver tags now refer to MSBuild SDK images. The latest tag now refers to the same image as 1.1.0-sdk-msbuild-rc4, while nanoserver now refers to the same image as 1.1.0-sdk-msbuild-rc4-nanoserver. Previously, those two tags referred to the same images as 1.1.0-sdk-projectjson-rc3 and 1.1.0-sdk-projectjson-rc3-nanoserver, respectively.
This is a breaking change, since the msbuild SDK is not compatible with the project.json-based SDK. We need to start moving the .NET Core ecosystem to the msbuild SDK sooner than expected. We had originally planned to make this change at Visual Studio 2017 RTM. The number of times the latest tag is being pulled is growing much faster than we expected, making the break worse with each passing day. As a result, we were compelled to make this change with this release.
You can continue to use the project.json images for now, listed below, to give you more time to transition to the msbuild images (see dotnet migrate). Changing to one of these more specific tags is a one-line change in a Dockerfile.
1.1.0-sdk-projectjson-rc3
1.1.0-sdk-projectjson-rc3-nanoserver
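For illustration, pinning to one of these tags might look like the Dockerfile sketch below. Only the tag comes from the list above; the image repository name and the surrounding build steps are assumptions for the sake of the example.

```dockerfile
# Pin to the project.json-based SDK image explicitly, rather than "latest",
# which now points at the MSBuild-based SDK.
FROM microsoft/dotnet:1.1.0-sdk-projectjson-rc3

WORKDIR /app
COPY . .
RUN dotnet restore
CMD ["dotnet", "run"]
```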
Note: We are no longer updating the project.json images, so please do plan your transition to the msbuild images. For example, only the msbuild SDK images will be updated when we release the 1.0.4 and 1.1.1 runtime updates (we expect) later this quarter.
We apologize if this change breaks you. We will be providing general guidance on how best to use our tags to avoid a similar situation in the future. We’ve been learning a lot about Docker over the last several months, particularly around versioning and naming. Expect a blog post soon on this topic that addresses these issues.
Changes to Supported Linux Distros
Fedora 23 and openSUSE 13.2 recently went out of support, per their respective project lifecycles. As a result, we are no longer supporting or building for Fedora 23 and openSUSE 13.2.
We will be publishing a more formal policy on Linux distro support, in particular on managing end-of-life of distros. There will be opportunity for feedback on the policy before it is finalized.
Project Files
In the RC3 release, we made major improvements to make the csproj project files smaller. If you are using .NET Core project files created with earlier Visual Studio 2017 versions (before RC3), you should read the Updating Project Files section of the RC3 blog post to learn about changes you need to make to your project files.
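For a rough sense of scale, a minimal MSBuild-style csproj after those improvements looks something like the sketch below; the exact contents vary by project, and the TargetFramework value here is purely illustrative.

```xml
<Project Sdk="Microsoft.NET.Sdk">

  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <TargetFramework>netcoreapp1.0</TargetFramework>
  </PropertyGroup>

</Project>
```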
dotnet new
The dotnet new command is one of the most important parts of the .NET Core tools experience. It is useful for both new and experienced .NET Core users. I know that people who use and test the product on a daily basis use dotnet new all the time for experiments and prototypes. I do! It’s also documented on a lot of websites and markdown pages to help users get started with .NET Core. That said, we always knew that dotnet new was a little lacking and decided to improve it.
In short, we want dotnet new to have the following characteristics:
- Powerful — expressive and scriptable command-line syntax.
- Helpful — an interactive mode helps users pick the templates they need (think Yeoman).
- Extensible — anyone can write templates for dotnet new!
- Updatable — templates can be updated outside of primary delivery vehicles (e.g. Visual Studio, .NET Core SDK).
- Platform — can be used by tools like Visual Studio and generator-aspnet (think yo aspnet).
dotnet new is now based on a new templating engine, which you can check out at dotnet/templating. It already does a great job satisfying what the RC3 version of dotnet new did. We’ll continue to add to it and improve it over the next several months, getting it to the point that it satisfies all the characteristics above. For the immediate term, we’re focused on ensuring that it has the right quality level for Visual Studio 2017 RTM.
Improvements
We have updated dotnet new in the RC4 release with the following features:
You can now specify a target directory for your new template with the -o argument, as in the following example: dotnet new console -o awesome-new-tool. If the target directory does not exist, it will be created for you. This can also be combined with the -n argument to name projects, as in the following example: dotnet new console -n awesome-new-tool -o src/awesome.
Target frameworks now have their own argument, -f. You can specify a target framework for any template, provided it is a legal value, as in: dotnet new console -f netcoreapp1.0. The target framework values are the same as the ones used in the project files.
Solution file management has been improved. You can now create an empty solution file with dotnet new sln and then add projects to it. You can create solution files before or after project files, depending on your preferred workflow. If you have been using the older project.json-based tooling, you can think of solution files as the replacement for global.json files.
Important Changes
The basic dotnet new (no arguments) experience no longer defaults to creating a console template, as it did in RC3 and earlier releases. The dotnet new command will now print the available set of templates, much like dotnet new --help. In a later release, we may update dotnet new to start an interactive new-template experience, which helps you select the right template based on a series of questions.
The new command line has been streamlined. To create templates, you type dotnet new console or dotnet new web for console app or MVC templates, respectively. The RC3 and earlier tools versions required a -t argument before the template name, such as dotnet new -t web.
Some of the template names changed, specifically Lib (now classlib) and Xunittest (now xunit). For RC4, you will need to use the new template names.
Walkthrough of the new template experience
You are probably curious about the new dotnet new experience. Sayed Hashimi, the Program Manager for dotnet new, wrote the following walkthrough to give you a good idea of what to expect. That said, I encourage you to install the RC4 SDK and try it out for yourself.
Sayed’s walkthrough was done on Linux. You can replicate the same experience on Windows; just make sure to replace the Linux commands with the equivalents in your favorite Windows shell.
Getting familiar with the new new
First let’s get a little familiar with new by displaying the help, using dotnet new --help. The result is shown below.
$ dotnet new --help
Template Instantiation Commands for .NET Core CLI.
Usage: dotnet new [arguments] [options]
Arguments:
  template  The template to instantiate.
Options:
  -l|--list         List templates containing the specified name.
  -lang|--language  Specifies the language of the template to create
  -n|--name         The name for the output being created. If no name is specified, the name of the current directory is used.
  -o|--output       Location to place the generated output.
  -h|--help         Displays help for this command.
  -all|--show-all   Shows all templates
Templates                             Short Name  Language  Tags
--------------------------------------------------------------------------------------
Console Application                   console     [C#], F#  Common/Console
Class library                         classlib    [C#], F#  Common/Library
Unit Test Project                     mstest      [C#], F#  Test/MSTest
xUnit Test Project                    xunit       [C#], F#  Test/xUnit
Empty ASP.NET Core Web Application    web         [C#]      Web/Empty
MVC ASP.NET Core Web Application      mvc         [C#], F#  Web/MVC
Web API ASP.NET Core Web Application  webapi      [C#]      Web/WebAPI
Solution File                         sln                   Solution
Examples:
  dotnet new mvc --auth None --framework netcoreapp1.0
  dotnet new mstest --framework netcoreapp1.0
  dotnet new --help
From the help output we can see that to create a project we can execute dotnet new <templatename>. The template names are displayed in the results of --help, but you can also get the names using dotnet new -l.
Creating Projects
Let’s create a new HelloWorld console app. The most basic way to create a console app is with the command dotnet new console. The other parameters that we can specify are listed below.
-n|--name
-o|--output
-lang|--language
In this case we want to create a C# console app named HelloWorld in the src/HelloWorld directory. Since C# is the default language for the console app template (the default value is indicated in help by [ ]), there is no need to pass a value to -lang. To create the project, execute dotnet new console -n HelloWorld -o src/HelloWorld. The result is shown below.
$ dotnet new console -n HelloWorld -o src/HelloWorld
Content generation time: 32.4513 ms
The template "Console Application" created successfully.
Let’s see what was generated by listing the files on disk.
$ ls -R src
./src:
HelloWorld
./src/HelloWorld:
HelloWorld.csproj Program.cs
The HelloWorld project was created as expected in src/HelloWorld, and it consists of two files, HelloWorld.csproj and Program.cs. Let’s restore the packages and run the app using dotnet restore and then dotnet run. See the result.
$ cd src/HelloWorld/
$ dotnet restore
Restoring packages for /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/HelloWorld.csproj...
Generating MSBuild file /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/obj/HelloWorld.csproj.nuget.g.props.
Generating MSBuild file /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/obj/HelloWorld.csproj.nuget.g.targets.
Writing lock file to disk. Path: /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/obj/project.assets.json
Restore completed in 953.36 ms for /Users/sayedhashimi/temp/blog/samples/src/HelloWorld/HelloWorld.csproj.
NuGet Config files used:
    /Users/sayedhashimi/.nuget/NuGet/NuGet.Config
Feeds used:
    https://api.nuget.org/v3/index.json
$ dotnet run
Hello World!
From the output we can see that the packages were restored successfully, and when the app was executed, Hello World! was printed to the console.
Templates with Options
Templates can expose options, which customize template output based on user input. We can see those options by calling --help on a template, such as with dotnet new mvc --help.
$ dotnet new mvc --help
Template Instantiation Commands for .NET Core CLI.
Usage: dotnet new [arguments] [options]
Arguments:
  template  The template to instantiate.
Options:
  -l|--list         List templates containing the specified name.
  -lang|--language  Specifies the language of the template to create
  -n|--name         The name for the output being created. If no name is specified, the name of the current directory is used.
  -o|--output       Location to place the generated output.
  -h|--help         Displays help for this command.
  -all|--show-all   Shows all templates
MVC ASP.NET Core Web Application (C#)
Author: Microsoft
Options:
  -au|--auth           The type of authentication to use
                           None       - No authentication
                           Individual - Individual authentication
                       Default: None
  -uld|--use-local-db  Whether or not to use LocalDB instead of SQLite
                       bool - Optional
                       Default: false
  -f|--framework
                           netcoreapp1.0 - Target netcoreapp1.0
                           netcoreapp1.1 - Target netcoreapp1.1
                       Default: netcoreapp1.0
Here we can see that the mvc template has three specific parameters. In this case, let’s create an mvc app named MyWeb in the src/MyWeb directory targeting netcoreapp1.1. To do that, we will execute dotnet new mvc -n MyWeb -o src/MyWeb -au Individual -f netcoreapp1.1.
$ dotnet new mvc -n MyWeb -o src/MyWeb -au Individual -f netcoreapp1.1
Content generation time: 429.6003 ms
The template "MVC ASP.NET Web Application" created successfully.
Now the project has been created in the src/MyWeb directory. Let’s take a look.
$ ls -lp src/MyWeb/
total 80
drwxr-xr-x 5 sayedhashimi staff 170 Feb 3 10:43 Controllers/
drwxr-xr-x 4 sayedhashimi staff 136 Feb 3 10:43 Data/
drwxr-xr-x 5 sayedhashimi staff 170 Feb 3 10:43 Models/
-rwxr--r-- 1 sayedhashimi staff 1767 Feb 3 10:43 MyWeb.csproj
-rwxr--r-- 1 sayedhashimi staff 4096 Feb 3 10:43 MyWeb.db
-rwxr--r-- 1 sayedhashimi staff 544 Feb 3 10:43 Program.cs
drwxr-xr-x 5 sayedhashimi staff 170 Feb 3 10:43 Services/
-rwxr--r-- 1 sayedhashimi staff 3081 Feb 3 10:43 Startup.cs
drwxr-xr-x 8 sayedhashimi staff 272 Feb 3 10:43 Views/
-rwxr--r-- 1 sayedhashimi staff 168 Feb 3 10:43 appsettings.Development.json
-rwxr--r-- 1 sayedhashimi staff 185 Feb 3 10:43 appsettings.json
-rwxr--r-- 1 sayedhashimi staff 197 Feb 3 10:43 bower.json
-rwxr--r-- 1 sayedhashimi staff 604 Feb 3 10:43 bundleconfig.json
-rwxr--r-- 1 sayedhashimi staff 61 Feb 3 10:43 runtimeconfig.template.json
-rwxr--r-- 1 sayedhashimi staff 680 Feb 3 10:43 web.config
drwxr-xr-x 8 sayedhashimi staff 272 Feb 3 10:54 wwwroot/
Future Plans
We want to enable everyone to create templates and make it easy to share those templates. Templates will be installable as NuGet packages or from a folder. In the meantime, check out the templating wiki for info on creating templates. I was happy to see the Custom project templates using dotnet new post by one of the community members that we’ve been working with for early feedback. Here’s my favorite quote from his post:
“This new method makes creating project templates about as easy as it’s ever going to get and allows really easy sharing, versioning and personalization of project templates.”
We are also working on a way to enable templates to be updated. For critical fixes, we are considering updating templates without any user interaction. For general updates, we are looking to add a new --update option.
We are working on plans to integrate the templating engine with the Visual Studio family of IDEs and other template experiences, such as Yeoman. We have a vision of everyone producing templates in a single format that works with all .NET tools. Wouldn’t that be nice!?! If you’re interested in learning more about how yo aspnet relates to dotnet new, see my comments on the topic.
Last, we’re hoping to update the command line experience to be interactive. In this mode we will prompt for things like the template name, project name, and the other information that you otherwise need to provide as command line arguments. We believe that interactive is the ultimate new-user experience.
Summary
I’ve been asked several times recently when the .NET Core Tools will ship a final RTM release. The tools will ship as an RTM release the same day as Visual Studio 2017 RTM. We’re getting close. As I said at the start of the post, we’ve got a few more bugs to squash first and then we’ll be happy to get the release out the door for you to use.
In this release, we’ve focused on quality improvements. We also switched over to a new and more capable templating engine; the new dotnet new implementation is largely a replacement of the functionality that was included in the RC3 release. In upcoming releases, you should expect to see some great new features that make you more productive at the command line. We hope to integrate this new system into Visual Studio, too, enabling us (and you!) to share templates across all .NET Core tools.
Thanks to Sayed Hashimi for the write-up on the new dotnet new implementation!
As always, please share your feedback, either in the comments, via email, or on Twitter.
- For general feedback, please contact us at dotnet@microsoft.com or at @dotnet.
- For template users and authors, feel free to contact Sayed directly at sayedha@microsoft.com or @SayedIHashimi.
Thanks!
Can Learning Smalltalk Make You A Better Programmer?
Louis C.K. on the 2016 election
On Conan last night, Louis C.K. had some things to say about the 2016 presidential election.
If you vote for Hillary you’re a grownup; if you vote for Trump you’re a sucker; if you don’t vote for anyone, you’re an asshole.
Breakfast
A quick short film about how different people in LA approach breakfast, from the woman who nibbles to the professional bodybuilder who eats 30 egg whites and a cup of grits on his off day.
Me? I almost never eat breakfast and have been unwittingly following an intermittent fasting regimen for many years.
Comic: The RegurgiStation
Tracking down a performance hit
I’ve been following the progress of .NET Core with a lot of interest, and trying to make the Noda Time master branch keep up with it. The aim is that when Noda Time 2.0 eventually ships (apologies for the delays…) it will be compatible with .NET Core from the start. (I’d expected to be able to support netstandard1.0, but that appears to have too much missing from it. It looks like netstandard1.3 will be the actual target.)
I’ve been particularly looking forward to being able to run the Noda Time benchmarks (now using BenchmarkDotNet) to compare .NET Core on Linux with the same code on Windows. In order to make that a fair comparison, I now have two Intel NUCs, both sporting an i5-5250U and 8GB of memory.
As it happens, I haven’t got as far as running the benchmarks under .NET Core – but I am now able to run all the unit tests on both Linux and Windows, using both the net451 TFM and netcoreapp1.0.
When I did that recently, I was pretty shocked to see that (depending on which tests I ran) the tests were 6-10 times slower on Linux than on Windows, using netcoreapp1.0 in both cases. This post is a brief log of what I did to track down the problem.
Step 1: Check that there’s really a problem
Thought: Is this actually just a matter of not running the tests in a release configuration, or something similar?
Verification: I ran the tests several times, specifying -c Release on the command line to use the release build of both NodaTime.Test.dll and NodaTime.dll. Running under a debugger definitely wasn’t an issue, as this was all just done from the shell.
Additionally, I ran the tests in two ways – firstly, running the whole test suite, and secondly running with --where=cat!=Slow to avoid the few tests I’ve got which are known to be really pretty slow. They’re typically tests which compare the answers the BCL gives with the answers Noda Time gives, across the whole of history for a particular calendar system or time zone. I’m pleased to report that the bottleneck in these tests is almost always the BCL, but that doesn’t help to speed them up. If only the “slow” tests had been much slower on Linux, that might have pointed to the problems being in BCL calendar or time zone code.
The ratios vary, but there was enough of a problem under both circumstances for it to be worth looking further.
Step 2: Find a problematic test
I didn’t have very strong expectations one way or another about whether this would come down to some general problem in the JIT on Linux, or whether there might be one piece of code causing problems in some tests but not others. Knowing that there are significant differences in handling of some culture and time zone code between the Linux and Windows implementations, I wanted to find a test which used the BCL as little as possible – but which was also slow enough for the differences in timing to be pronounced and not easily explicable by the problems of measuring small amounts of time.
Fortunately, NUnit produces a TestResult.xml file which is easy to parse with LINQ to XML, so I could easily transform the results from Windows and Linux into a list of tests, ordered by duration (descending), and spot the right kind of test.
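The same idea can be sketched outside of LINQ to XML; here is a small Python version that reads a result file and sorts test cases by duration. The element and attribute names (test-case, fullname, duration) assume NUnit 3’s result format.

```python
# Sketch: list the slowest tests from an NUnit 3 TestResult.xml, assuming
# NUnit 3's <test-case fullname="..." duration="..."> result format.
import xml.etree.ElementTree as ET

def slowest_tests(path):
    """Return (test name, duration in seconds) pairs, slowest first."""
    root = ET.parse(path).getroot()
    cases = [(tc.get("fullname"), float(tc.get("duration", 0)))
             for tc in root.iter("test-case")]  # test-cases nest inside test-suites
    return sorted(cases, key=lambda c: c[1], reverse=True)
```

Running this over the Windows and Linux result files and comparing the two lists side by side is enough to spot which tests diverge most.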
I found my answer in UmAlQuraYearMonthDayCalculatorTest.GetYearMonthDay_DaysSinceEpoch, which effectively tests the Um Al Qura calendar for self consistency, by iterating over every day in the supported time period and checking that we can convert from “days since Unix epoch” to an expected “year, month day”. In particular, this test doesn’t rely on the Windows implementation of the calendar, nor does it use any time zones, cultures or anything similar. It’s nicely self-contained.
This test took 2051ms on Linux and 295ms on Windows. It’s possible that those figures were from a debug build, but I repeated the tests using a release build and confirmed that the difference was still similar.
Step 3: Find the bottleneck
At this point, my aim was to try to remove bits of the test at a time, until the difference went away. I expected to find something quite obscure causing the difference – something like different CPU cache behaviour. I knew that the next step would be to isolate the problem to a small piece of code, but I expected that it would involve a reasonable chunk of Noda Time – at least a few types.
I was really lucky here – the first and most obvious call to remove made a big difference: the equality assertion. Assertions are usually the first thing to remove in tests, because everything else typically builds something that you use in the assertions… if you’re making a call without either using the result later or asserting something about the result, presumably you’re only interested in side effects.
As soon as I removed the call to Assert.AreEqual(expected, actual), the execution time dropped massively on Linux, but hardly moved on Windows: they were effectively on a par.
I wondered whether the problem was with the fact that I was asserting equality between custom structs, and so tried replacing the real assertions with assertions of equality of strings, then of integers. No significant difference – they all showed the same discrepancy between Windows and Linux.
Step 4: Remove Noda Time
Once I’d identified the assertions as the cause of the problem, it was trivial to start a new test project with no dependency on Noda Time, consisting of a test like this:
```csharp
[Test]
public void Foo()
{
    for (int i = 0; i < 1000000; i++)
    {
        var x = 10;
        var y = 10;
        Assert.AreEqual(x, y);
    }
}
```
This still demonstrated the problem consistently, and allowed simpler experimentation with different assertions.
Step 5: Dig into NUnit
For once in my life, I was glad that a lot of implementation details of a framework were exposed publicly. I was able to try lots of different “bits” of asserting equality, in order to pin down the problem. Things I tried:
- Assert.AreEqual(x, y): slow
- Assert.That(x, Is.EqualTo(y)): slow
- Constructing an NUnitEqualityComparer: fast
- Calling NUnitEqualityComparer.AreEqual: fast. (Here the construction occurred before the loop, and the comparisons were in the loop.)
- Calling Is.EqualTo(y): slow
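The shape of those experiments was the same each time: put the interesting call in a loop, and time it. A sketch of the two most telling cases (NUnitEqualityComparer and Is.EqualTo are the real NUnit members; the timing scaffolding is mine):

```csharp
using System;
using System.Diagnostics;
using NUnit.Framework;
using NUnit.Framework.Constraints;

class ConstraintTimings
{
    const int Iterations = 1000000;

    static void Main()
    {
        // Fast on both platforms: constructing the equality comparer itself.
        Time("new NUnitEqualityComparer()", () => new NUnitEqualityComparer());

        // Slow on Linux: merely constructing the constraint via Is.EqualTo.
        Time("Is.EqualTo(10)", () => Is.EqualTo(10));
    }

    static void Time(string label, Action action)
    {
        var stopwatch = Stopwatch.StartNew();
        for (int i = 0; i < Iterations; i++)
        {
            action();
        }
        Console.WriteLine($"{label}: {stopwatch.ElapsedMilliseconds}ms");
    }
}
```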
The last two bullets were surprising. I'd been tipped off that NUnitEqualityComparer uses reflection, which could easily differ in performance between Windows and Linux… but checking for equality seemed to be fast, and just constructing the constraint was slow. In poking around the NUnit source code (thank goodness for Open Source!) it's obvious why Assert.AreEqual(x, y) and Assert.That(y, Is.EqualTo(x)) behave the same way: the former just calls the latter.
So, why is Is.EqualTo(y) slow (on Linux)? The method itself is simple: it just creates an instance of EqualConstraint. The EqualConstraint constructor body doesn't do much… so I proved that it's not EqualConstraint causing the problem by deriving my own constraint with a no-op implementation of ApplyTo… sure enough, just constructing that is slow.
That leaves the constructor of the Constraint abstract base class:
```csharp
protected Constraint(params object[] args)
{
    Arguments = args;

    DisplayName = this.GetType().Name;
    if (DisplayName.EndsWith("`1") || DisplayName.EndsWith("`2"))
        DisplayName = DisplayName.Substring(0, DisplayName.Length - 2);
    if (DisplayName.EndsWith("Constraint"))
        DisplayName = DisplayName.Substring(0, DisplayName.Length - 10);
}
```
That looks innocuous enough… but maybe calling GetType().Name is expensive on Linux. So test that… nope, it's fast.
At this point I’m beginning to wonder whether we’ll ever get to the bottom of it, but let’s just try…
```csharp
[Test]
public void EndsWith()
{
    string text = "abcdefg";
    for (int i = 0; i < Iterations; i++)
    {
        text.EndsWith("123");
    }
}
```
… and sure enough, it’s fast on Windows and slow on Linux. Wow. Looks like we have a culprit.
Step 6: Remove NUnit
At this point, it's relatively plain sailing. We can reproduce the issue in a simple console app. I won't list the code here, but it's in the GitHub issue. It just times calling EndsWith once (to get it JIT compiled) and then a million times. Is it the most rigorous benchmark in the world? Absolutely not… but when the difference is between 5.3s on Linux and 0.16s on Windows, on the same hardware, I'm not worried about inaccuracy of a few milliseconds here or there.
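For reference, a minimal console app along those lines looks something like this. This is my reconstruction of the idea, not the exact code from the issue:

```csharp
using System;
using System.Diagnostics;

class EndsWithBenchmark
{
    static void Main()
    {
        string text = "abcdefg";
        // One call first, so that JIT compilation isn't included in the timing.
        text.EndsWith("123");

        var stopwatch = Stopwatch.StartNew();
        for (int i = 0; i < 1000000; i++)
        {
            text.EndsWith("123");
        }
        stopwatch.Stop();
        Console.WriteLine($"1,000,000 iterations: {stopwatch.ElapsedMilliseconds}ms");
    }
}
```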
Step 7: File a CoreCLR issue
So, as I’ve shown, I filed a bug on GitHub. I’d like to think it was a pretty good bug report:
- Details of the environment
- Short but complete console app ready to copy/paste/compile/run
- Results
Exactly the kind of thing I’d have put into a Stack Overflow question – when I ask for a minimal, complete example on Stack Overflow, this is what I mean.
Anyway, about 20 minutes later (!!!), Stephen Toub had basically worked out the nub of it: it's a culture issue. Initially, he couldn't reproduce it; he saw the same results on Windows and Linux. But after changing his culture to en-GB, he saw what I was seeing. I then confirmed the opposite: when I ran the code having set LANG=en-US, the problem went away for me. Stephen pulled Matt Ellis in, who gave more details as to what was going wrong behind the scenes.
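The underlying behaviour is easy to demonstrate: the single-argument EndsWith overload is culture-sensitive, so the process's default culture (which LANG controls on Linux) determines which comparison path it takes. A small illustration; the culture-setting line is only there to make the dependency explicit:

```csharp
using System;
using System.Globalization;
using System.Threading;

class CultureSensitivity
{
    static void Main()
    {
        // The single-argument EndsWith overload uses the current culture's
        // comparison rules, so its performance can vary with the default culture.
        Thread.CurrentThread.CurrentCulture = new CultureInfo("en-GB");

        string text = "abcdefg";
        Console.WriteLine(text.EndsWith("123"));                           // culture-sensitive path: False
        Console.WriteLine(text.EndsWith("123", StringComparison.Ordinal)); // ordinal fast path: False
    }
}
```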
Step 8: File an NUnit issue
Matt Ellis suggested filing an issue against NUnit, as there's no reason this code should be culture-sensitive. By specifying the string comparison as Ordinal, we can go through an even faster path than using the US culture. So
if (DisplayName.EndsWith("Constraint"))
becomes
if (DisplayName.EndsWith("Constraint", StringComparison.Ordinal))
… and the equivalent for the other two calls.
I pointed out in the issue that it was also a little bit odd that this was being worked out in every Constraint constructor call, when of course it's going to give the same result for every instance of the same type. When "every Constraint constructor call" becomes "every assertion in an entire test run", it's a pretty performance-critical piece of code. While unit tests aren't important in terms of performance in the same way that production code is, anything which adds friction is bad news.
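Beyond the Ordinal change, one obvious shape for a fix would be to compute the display name at most once per constraint type and cache it. This is just my sketch of the idea, not NUnit's actual code:

```csharp
using System;
using System.Collections.Concurrent;

static class DisplayNameCache
{
    private static readonly ConcurrentDictionary<Type, string> Cache =
        new ConcurrentDictionary<Type, string>();

    // Computes the NUnit-style display name at most once per constraint type.
    public static string For(Type type) =>
        Cache.GetOrAdd(type, t =>
        {
            string name = t.Name;
            // Strip a generic arity suffix, e.g. "EqualConstraint`1" -> "EqualConstraint".
            if (name.EndsWith("`1", StringComparison.Ordinal) ||
                name.EndsWith("`2", StringComparison.Ordinal))
            {
                name = name.Substring(0, name.Length - 2);
            }
            // Strip a trailing "Constraint", e.g. "EqualConstraint" -> "Equal".
            if (name.EndsWith("Constraint", StringComparison.Ordinal))
            {
                name = name.Substring(0, name.Length - "Constraint".Length);
            }
            return name;
        });
}
```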
Hopefully the NUnit team will apply the simple improvement for the next release, and then the CoreCLR team can attack the tougher underlying problem over time.
Step 9: Blog about it
Open up Stack Edit, start typing: “I’ve been following the progress”… :)
Conclusion
None of the steps I’ve listed here is particularly tricky. Diagnosing problems is often more a matter of determination and being unwilling to admit defeat than cleverness. (I’m not denying that there’s a certain art to being able to find the right seam to split the problem in two, admittedly.)
I hope this has been useful as a “start to finish” example of what a diagnostic session can look and feel like. It wasn’t one physical session, of course – I found bits of time to investigate it over the course of a day or so – but it would have been the same steps either way.
Smug, satisfied smile…
Microsoft Makes Xamarin Free In Visual Studio, Will Open Source Core Xamarin Tech
Mid-day news: A Wings-Canucks preview; ESPN rankings and suspension talk
Of Red Wings-related note this afternoon:
The Wings and Canucks have yet to take part in their morning skate, but the Canucks website's Daniel Fung has penned a preview of tonight's game (10 PM EDT on FSD/CBC/97.1 FM):
Keys to the Game
Canucks: The bottom-9 has to produce. Coach Desjardins has admitted his reluctance to reunite the Sedins and Vrbata was because he didn't want to be a one-line team. Vrbata and the twins delivered on Thursday night but the onus is on the rest of the forwards to produce as well so opponents have to worry about something other than shutting down the Sedin line.
Canucks: Build on Thursday's PP success. After bumping their long man-advantage slump against the Capitals, Vancouver's power play figures to get another boost with the return of Weber to the line-up. Weber was tied for second on the team with five power play goals last season and provides a right-shot, one-timer option that has sorely been missing for most of this season.
Red Wings: Get the early jump. Visiting teams have gotten off to strong starts in every game so far this season at Rogers Arena and the Red Wings will want to follow that same formula to success. Considering all three of Detroit's wins this season have come when they've opened the scoring, the opening goal tonight will be a critical one.
VanCityBuzz's Ian Lusher also notes that the Canucks are 0-2-2 at home this season...
OtterBox Symmetry Series Folio Cases Now Available for iPad Air 2 and iPad Mini 3 [iOS Blog]
The Symmetry Series cases for iPad double as both protective cases and multi-position stands with 360-degree rotation for landscape and portrait viewing. The cases also automatically wake and sleep the iPad screen when the folio is opened and closed, and have certified drop protection in case your iPad takes a tumble.
OtterBox Symmetry Series folio cases for iPad Air 2 and iPad mini 3 can be purchased through the accessory maker's website for $69.95 and $59.95 respectively. OtterBox makes the number-one selling cases in North America and was one of the earlier providers of cases and accessories for iPhone, iPad and other products by both Apple and competitors.
Century Old Antarctic Expedition Notebook Found Underneath Ice
Questions For Khan
Q: Do you think the Wings will trade Dylan Larkin for Tyler Myers? I hope not. I like Myers' size and right shot but I hope they don't sacrifice the future for him.
A: My guess is they won't part with Larkin in their efforts to acquire Myers. They're committed to building through the draft. They relinquished a young center (Calle Jarnkrok) last season because they felt they needed a veteran center (David Legwand) to help them reach the playoffs.
They have a couple of other promising centers in the system in Tomas Nosek and Mattias Janmark, but Larkin, their top pick in 2014 (15th overall), by several accounts, has the potential to be much better. It's no surprise the Sabres are seeking him, after being told top prospect Anthony Mantha isn't going anywhere.
The Red Wings will have to relinquish something of value to land the young (24), 6-foot-8, puck-moving defenseman Myers. At 1-7, the Sabres are well on their way to reaching their goal of gaining the most lottery balls for the Connor McDavid draft in June, so they shouldn't be in any hurry to get rid of all their best players.
more questions for Ansar Khan of Mlive including one on Weiss...
How #3DPrinting went from pipe dream to your desktop #3DPrinting #3DThursday
3D printers are all the rage with enthusiasts, but they didn’t just materialize out of nowhere like the sculptures they produce. Here’s the untold story of how the next big boom in technology came to be over 30 years.
Read more on digitaltrends.com.
Every Thursday is #3dthursday here at Adafruit! The DIY 3D printing community has passion and dedication for making solid objects from digital models. Recently, we have noticed electronics projects integrated with 3D printed enclosures, brackets, and sculptures, so each Thursday we celebrate and highlight these bold pioneers!
Have you considered building a 3D project around an Arduino or other microcontroller? How about printing a bracket to mount your Raspberry Pi to the back of your HD monitor? And don’t forget the countless LED projects that are possible when you are modeling your projects in 3D!