Platform independent development with .NET

We develop most of our projects as platform independent applications, usually running under Windows, Mac and Linux. There are exceptions, for example when an application has to communicate with special hardware drivers, third-party libraries or other components that are not available on all platforms. But even then we isolate these parts into interchangeable modules that can be operated either in a simulated mode or against the real thing. The simulated modes are platform independent. Developers can usually work on the code base using their favorite operating system. Of course, the software has to be tested on the target platform(s) that the application will run on in the end.
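
Such an interchangeable module can be as simple as an interface with a hardware-backed and a simulated implementation. A minimal sketch in C# could look like this (the interface and class names are purely illustrative, not taken from an actual project):

  public interface IMotorController
  {
      void MoveTo(int position);
  }

  // Talks to the vendor's hardware driver; only available on the target platform.
  public class HardwareMotorController : IMotorController
  {
      public void MoveTo(int position) { /* call into the native driver here */ }
  }

  // Platform independent stand-in used during development and testing.
  public class SimulatedMotorController : IMotorController
  {
      private int currentPosition;

      public void MoveTo(int position) { currentPosition = position; }
  }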

Platform independent development is both a matter of technology choices and programming practices. Concerning the technology, the ecosystem based on the Java VM is a proven choice for platform independent development. We have developed many projects in Java and other JVM based languages. All of our developers are polyglots and we are able to develop software with a wide variety of programming languages.

The .NET ecosystem

Until recently, the .NET platform was known as a mainly Microsoft Windows based ecosystem. The Mono project was started by non-Microsoft developers to provide an open source implementation of .NET for other operating systems, but it never had the same status as Microsoft's official .NET on Windows.

However, recently Microsoft has changed course: They open sourced their .NET implementation and are porting it to other platforms. They acquired Xamarin, the company behind the Mono project, and they are releasing developer tools such as IDEs for non-Windows platforms.

IDEs for non-Windows platforms

If you want to develop a .NET project on a platform other than Windows, you now have several choices for an IDE:

  • Xamarin Studio
  • MonoDevelop
  • Visual Studio for Mac
  • JetBrains Rider

I am currently using JetBrains Rider on a Mac to develop a .NET based application in C#. Since I have used other JetBrains products before, it feels very familiar. Xamarin Studio, MonoDevelop, Visual Studio for Mac and JetBrains Rider all support the solution and project file format of the original Visual Studio for Windows. This means a .NET project can be developed with any of these IDEs.

Web applications

The .NET application I am developing is based on web technologies. The server side uses the NancyFX web framework, the client side uses React. Persistence is done with Microsoft's Entity Framework. All the libraries I need for the project, such as NancyFX, Entity Framework, a PostgreSQL driver, JSON.NET, NLog and NUnit, work on non-Windows platforms without any problems.
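
As an illustration of how little platform specific code is involved, here is a minimal NancyFX module in C# (Nancy 1.x syntax; the route and message are made up):

  using Nancy;

  public class HelloModule : NancyModule
  {
      public HelloModule()
      {
          // Nancy discovers modules automatically; this maps GET / to a simple handler.
          Get["/"] = _ => "Hello from a platform independent .NET web application";
      }
  }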

Conclusion

Development of .NET applications is no longer limited to the Windows platform. Microsoft is actively opening up their development platform for other operating systems.

MSBuild Basics

MSBuild is Microsoft's build system for Visual Studio. Visual Studio project files (*.csproj, *.vbproj) not only describe the project structure, but are also build scripts for MSBuild. They are executed when you click the run button in the IDE, but they can also be called via the MSBuild command line utility.

> MSBuild.exe Project.csproj

These project files / build scripts are in XML format, comparable to Ant scripts in the Java world.

Edit project files

You can edit these files in any text editor, of course. But if you want to edit them within Visual Studio, you have to unload the project first:

  • Right click on the project in the Solution Explorer -> Unload Project
  • Right click on the project in the Solution Explorer -> Edit MyProject.csproj

After you're done editing, you can reload the project via the context menu.

Targets and tasks

The concepts of MSBuild are comparable to many other build systems: a build script contains a set of named targets, and each target consists of a sequence of task calls.

A project can have one or more default targets, referenced by the DefaultTargets attribute of the Project root element:

<Project DefaultTargets="Build" ...>

Multiple targets can be separated by semicolons.
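
For example, a clean build could be declared as the default like this:

<Project DefaultTargets="Clean;Build" ...>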

Targets are declared via Target tags containing the task calls:

  <Target Name="Clean">
    <Delete Files="xyz.tmp" />
    ...
  </Target>

MSBuild comes with a set of common tasks, such as Message, Copy, Delete, Exec, …
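
As a sketch of how these built-in tasks are used inside a target (the target name, file names and the $(DeployDir) property are made up for illustration):

  <Target Name="Deploy" DependsOnTargets="Build">
    <Message Text="Deploying to $(DeployDir)..." />
    <Copy SourceFiles="bin\MyProject.exe" DestinationFolder="$(DeployDir)" />
    <Exec Command="deploy-notify.cmd" WorkingDirectory="$(DeployDir)" />
  </Target>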

If you need more tasks, you should have a look at these community provided task collections:

  • MSBuild Extension Pack
  • MSBuild Community Tasks

Both are available as NuGet packages and can be checked into your code repository alongside the project for self-containment. For the Extension Pack you have to set the ExtensionTasksPath property correctly before importing the tasks, for example:

<PropertyGroup>
  <ExtensionTasksPath Condition="'$(ExtensionTasksPath)' == ''">$(MSBuildProjectDirectory)\packages\MSBuild.Extension.Pack.1.5.0\tools\net40\</ExtensionTasksPath>
</PropertyGroup>

<Import Project="$(ExtensionTasksPath)MSBuild.ExtensionPack.tasks" />

Properties

Properties are defined within PropertyGroup tags, containing one or more property tags. The names of these tags are the property names and the tag contents are the property values. Properties are referenced via $(PropertyName). A property definition can have an optional Condition attribute, which determines whether the property should be set or not. The condition '$(PropertyName)' == '', for example, checks whether a property is not yet set.
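
For example, the following property group (made up for illustration) provides default values for the $(ProjectName) and $(BuildNumber) properties used in the ZIP target below:

<PropertyGroup>
  <ProjectName>MyProject</ProjectName>
  <!-- Only used if no BuildNumber was passed in from the outside. -->
  <BuildNumber Condition="'$(BuildNumber)' == ''">0</BuildNumber>
</PropertyGroup>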

Here’s an example build target that uses the ZIP compression task from the Extension Pack and some properties to create a ZIP file artifact from the build results:

<Target Name="AfterBuild">
  <MSBuild.ExtensionPack.Compression.Zip TaskAction="Create" CompressPath="$(OutputPath)" ZipFileName="bin\$(ProjectName)-$(BuildNumber).zip" />
</Target>

You can also set property values from the outside via the MSBuild call:

> MSBuild.exe /t:Build /p:Configuration=Release;BuildNumber=1234 Project.csproj

  • The /t switch determines which targets to run. Multiple targets can be separated by semicolons.
  • The /p switch sets properties in the form of PropertyName=value, also separated by semicolons.

This way you can pass environment variables like $BUILD_NUMBER from your Continuous Integration system (e.g. Jenkins) to your build script:

> MSBuild.exe /t:Build /p:Configuration=Release;BuildNumber=$BUILD_NUMBER Project.csproj

Now you could use the MSBuild.ExtensionPack.Framework.AssemblyInfo task to write the $(BuildNumber) property into your AssemblyInfo file.
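
A call could look roughly like the following sketch; the parameter names and the target hookup are assumptions on my part, so please check them against the Extension Pack documentation:

<Target Name="StampVersion" BeforeTargets="Build">
  <!-- AssemblyInfoFiles and AssemblyBuildNumber are assumed parameter names. -->
  <MSBuild.ExtensionPack.Framework.AssemblyInfo AssemblyInfoFiles="Properties\AssemblyInfo.cs" AssemblyBuildNumber="$(BuildNumber)" />
</Target>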

Recap of the Schneide Dev Brunch 2014-08-31

If you couldn't attend the Schneide Dev Brunch on the 31st of August 2014, here is a summary of the main topics.

Yesterday, we held another Schneide Dev Brunch, a regular brunch on a Sunday, only that all attendees want to talk about software development and various other topics. If you bring a software-related topic along with your food, everyone has something to share. The brunch was well-attended this time, but the weather didn't allow for an outside session. There were lots of topics and chatter. As always, this recapitulation tries to highlight the main topics of the brunch, but cannot reiterate everything that was spoken. If you were there, you probably find this list inconclusive:

Docker – the new (hot) kid in town

Docker is the hottest topic in software commissioning this year. It’s a lightweight virtualization technology, except that you don’t obtain full virtual machines. It’s somewhere between a full virtual machine and a simple chroot (change root). And it’s still not recommended for production usage, but is already in action in this role in many organizations.
We talked about the magic of git and the UnionFS that lies beneath the surface, the ease of migration and disposal and even the relative painlessness of running it on Windows. I can earnestly say that Docker is the technology that everyone will have had a look at before the year is over. We at the Softwareschneiderei will run an internal Docker workshop in September to make sure this statement holds true for us.

Git – the genius guy with issues

The discussion changed over to Git, the distributed version control system that supports every versioning scheme you can think of, but won't help you if you entangle yourself in the tripwires of your good intentions. Especially the surrounding tooling was of interest. Our attendees had experience with SmartGit and Sourcetree, both capable of awesome dangerous stuff like partial commits and excessive branching. We discovered a lot of different work styles with Git and can agree that Git supports them all.
When we mentioned code review tools, we discovered a widespread suspicion of heavy-handed approaches like Gerrit. There seems to be an underlying motivational tendency to utilize reviews to foster a culture of command and control. On a technical level, Gerrit probably messes with your branching strategy in an unpleasant way.

Teamwork – the pathological killer

We had a long and deep discussion about teamwork, liability and conflicts. I cannot reiterate everything, but will give a few pointers on how the discussion went. There is a common litmus test about shared responsibility – the "hold the line" mindset. Every big problem is a problem of the whole team, not of the poor guy who caused it. If your ONOZ lamp lights up and nobody cares because "they didn't commit anything recently", you just learned something about your team.
Conflicts are inevitable in every group of people larger than one. We talked about team dynamics and how most conflicts grow over long periods only to erupt in a sudden and painful way. We worked out that most people aren't aware of their own behaviour and cannot act "better", even if they were. We learned about the technique of self-distancing to gain insights about one's own feelings and emotional drive. Two books were mentioned that may support this area: "How to Cure a Fanatic" by Amos Oz and "On Liberty" by John Stuart Mill. Just a disclaimer: the discussion was long and the books most likely don't match the few headlines mentioned here exactly.

Code Contracts – the potential love affair

An observation of one attendee was the starting point for the next topic: (unit) tests as a means of spot checking don't exactly lead to the goal of full confidence in the code. The explicit declaration of invariants and subsequent verification of those invariants seem more likely to fulfil the confidence-giving role.
Turns out, another attendee just happened to be part of a discussion on "next generation verification tools", and invariant checking frameworks were one major topic. Especially the library Code Contracts from Microsoft showed impressive potential to really be beneficial in a day-to-day setting. Neat features like continuous verification in the IDE and automatic (smart) correction proposals make this approach really stand out. This video and this live presentation will provide more information.

While this works well in the “easy” area of VM-based languages like C#, the classical C/C++ ecosystem proves to be a tougher nut to crack. The common approach is to limit the scope of the tools to the area covered by LLVM, a widespread intermediate representation of source code.

Somehow, we came across the book "The Economics of Software Quality" by Capers Jones, which provides a treasure of statistical evidence about what might work in software development (or not). Another relatively new and controversial book is "Agile! The Good, the Hype and the Ugly" by Bertrand Meyer. We are looking forward to discussing them in future brunches.

Visual Studio – the merchant nobody likes but everybody visits

One attendee asked about realistic alternatives to Visual Studio for C++ development. Turns out, there aren't many, at least not free of charge. Most editors and IDEs aren't particularly bad, but lack the "everything already in the box" effect that Visual Studio provides for Windows-/Microsoft-only development. The main favorites were Sublime Text with a clang plugin, Orwell Dev-C++ (the fork of Bloodshed Dev-C++), Eclipse CDT (if the code assist failure isn't important), Code::Blocks and CodeLite. Of course, the classics like vim or emacs (with highly personalized plugins and setup) were mentioned, too. KDevelop and Xcode were mentioned as non-Windows alternatives.

Stinky Board – the nerdy doormat

One attendee experiments with input devices that might improve the interaction with computers. The Stinky Board is a foot-controlled device with four switches that act like additional keys. In comparison to other foot switches, it's very sturdy. The main use case for our attendee is keys that you need to keep pressed for their effect, like "sprint" or "track enemy" in computer games. In a work scenario, there are fewer of these situations. The additional buttons may serve for actions that are needed relatively infrequently, but regularly – like "run project".

This presentation produced a lot of new suggestions, like the Bragi smart headphones, which include sensors for head gestures. Imagine shaking your head for "undo change" or nodding for "run tests" – while listening to your fanciest tunes (you might want to refrain from headbanging then). A very interesting attempt to combine mouse, keyboard and joystick is the "King's Assembly", a weird two-piece device that's just too cool not to mention. We are looking forward to hearing more about it.

Epilogue

As usual, the Dev Brunch contained a lot more chatter and talk than listed here. The high number of attendees makes for a unique experience every time. We are looking forward to the next Dev Brunch at the Softwareschneiderei. And as always, we are open for guests and future regulars. Just drop us a notice and we'll invite you over next time.