Author Archives: William Roush

About William Roush

William Roush is currently employed as a Senior Software Developer and independent contractor in Chattanooga, Tennessee. He has more than 12 years of experience in the IT field, with a wide range of exposure including software development, deploying and maintaining virtual infrastructure, storage administration and Windows administration.

RhodeCode Session Storage Growing Out Of Control

Written by William Roush on October 11, 2016 at 6:13 pm

Had a fun run-in with RhodeCode recently: its session storage, located at /home/rhodecode/.rccontrol/community-1/data/sessions, was eating up 401,981 inodes! Eek!

This is caused by the default built-in session manager library not cleaning up old files. I stumbled across this blog post detailing how to “fix” it: http://it-spir.it/blog/2012-03-31-rhodecode-remove-outdated-session-data.html and ended up with this in my crontab:

0 1 * * * find /home/rhodecode/.rccontrol/community-1/data/sessions -type f -mtime +3 -exec rm {} \;
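
That runs every morning at 1:00 AM and deletes any session file that hasn’t been modified in more than three days (the -mtime +3 part).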

Alas, this wasn’t really what I wanted, so I asked on the RhodeCode-Community Slack channel. The fix is to use something like the database for session storage, as seen here: https://docs.rhodecode.com/RhodeCode-Enterprise/admin/tuning-increase-db-performance.html


A quick enabling of Beaker’s database session storage and I was all better!
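
For reference, the change boils down to a few lines in the instance’s rhodecode.ini. This is a sketch based on the linked docs, so double-check the option names against them; the connection URL is a placeholder for your own database:

# store sessions in the database instead of in data/sessions on disk
beaker.session.type = ext:database
beaker.session.table_name = db_session
beaker.session.sa.url = postgresql://postgres:secret@localhost/rhodecode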


server.urls Parameter Not Working On Kestrel Server for dotnet Command on ASP.NET Core RC2

Written by William Roush on May 21, 2016 at 3:53 am

Was struggling really hard with getting this command to work:


dotnet run -- --server.urls http://*:5000

Simple command, but it isn’t taking! Well, I missed one major thing when upgrading from RC1 to RC2: I now need to pass configuration down to the Kestrel stack, and doing that is pretty easy:

using System.IO;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;

public class Program
{
    public static void Main(string[] args)
    {
        // Pull configuration from command-line arguments (like --server.urls)
        // and any ASPNETCORE_-prefixed environment variables.
        var config = new ConfigurationBuilder()
            .AddCommandLine(args)
            .AddEnvironmentVariables(prefix: "ASPNETCORE_")
            .Build();

        var host = new WebHostBuilder()
            .UseConfiguration(config) // hands those settings down to Kestrel
            .UseKestrel()
            .UseContentRoot(Directory.GetCurrentDirectory())
            .UseIISIntegration()
            .UseStartup<Startup>()
            .Build();

        host.Run();
    }
}

Needed to define the “config” variable and pass it to WebHostBuilder using UseConfiguration(). All fixed!
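
With that in place, dotnet run -- --server.urls http://*:5000 flows through UseConfiguration() and Kestrel binds to port 5000 on all interfaces.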

.NET Core And Why I Think it’s Huge!

Written by William Roush on April 18, 2016 at 10:20 pm

.NET Core is not just a large change to how you write code, but to the entire .NET ecosystem. I’m excited about this change on every front, and I’m going to tell you why!

What is .NET Core?

.NET Core (previously known as .NET 5.0) is the new streamlined .NET platform from Microsoft. .NET Core takes a lot of lessons learned from pain points in the older .NET 1.0–4.6 platform and fixes a massive number of them, giving .NET a new breath of life and aiming to make the platform beneficial for all. It appears to take lessons not only from what didn’t work on .NET’s platform, but from what did work on other platforms.

What does that mean for developers?

Well, here is the painful part: .NET Core is different, very different. For cross-platform support you’ll have to keep an eye on what is .NET Core and what is .NET 4.5; thankfully IntelliSense does a great job of letting you know:

Watch out for missing implementations between frameworks.

.NET Core is designed to rely heavily on NuGet and have you pick and plug third-party libraries. It is built around smaller assemblies to keep down application size bloat and let you pick and choose. This is seen in the new Entity Framework Core libraries: they give you a small slice of functionality (data abstraction), and the individual engines are kept as separate libraries (which is common for ORMs on, say, Node.js).

The shift to heavy use of dependency injection as part of the normal application lifecycle is wonderful for decoupling your code and keeping it easily testable; everything is right there, ready to go from the start.
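
As a quick illustration (the IGreeter service and all the names here are made up for the example, but the registration and constructor-injection mechanics are the framework’s own):

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.DependencyInjection;

public interface IGreeter
{
    string Greet(string name);
}

public class Greeter : IGreeter
{
    public string Greet(string name) => $"Hello, {name}!";
}

public class Startup
{
    public void ConfigureServices(IServiceCollection services)
    {
        // Register the implementation; anything that asks for IGreeter
        // in its constructor gets a Greeter instance handed to it.
        services.AddTransient<IGreeter, Greeter>();
    }
}

public class HomeController : Controller
{
    private readonly IGreeter _greeter;

    // No manual wiring -- the container injects IGreeter for us.
    public HomeController(IGreeter greeter)
    {
        _greeter = greeter;
    }
}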

Configuration files being JSON is a huge step up. ASP.NET Core integrates Gulp into its build process for JavaScript libraries, for both build and client side (it relies on Bower for client-side scripts). The new project.json file allows for easy extension of your application’s behavior. If you’re used to .NET’s old way of doing things you’ll start with “This doesn’t work anymore? Ugh… how do you do that…. oh… that’s much better!”
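
For example, a bare-bones RC2-era project.json looks something like this (the exact package version string here is from memory; yours will come from the template):

{
  "version": "1.0.0-*",
  "buildOptions": {
    "emitEntryPoint": true
  },
  "dependencies": {
    "Microsoft.NETCore.App": {
      "type": "platform",
      "version": "1.0.0-rc2-3002702"
    }
  },
  "frameworks": {
    "netcoreapp1.0": {}
  }
}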

What does that mean for platform support?

.NET Core Builds

It means we’re seeing .NET on more platforms than we’re used to, as a first-class citizen from Microsoft, and that means huge things (I’ll touch on those later) for the popularity of .NET. Microsoft has even been quoted saying this:

When used with third-party tools such as Xamarin, .NET Core should be portable to iOS and Android devices.

And with Microsoft’s recent acquisition of Xamarin, and its push to make it free, we’re probably going to see .NET Core as a solid mobile platform in the near future.

What does this mean for the .NET Community?

A new .NET developer said this to me recently:

Not much of a supportive community around .NET, at least nothing like other languages I have learned.

And I can’t help but agree. While I appreciate our local .NET Users Group here in Chattanooga (which has been pretty quiet for a while), I find the subjects of our local Developers Group more interesting. And while .NET work generally pays decently and seems to be in a decent amount of demand here, it seems to be mostly the slower, older corporate companies that are hiring, and fewer of the smaller, agile companies.

I think getting .NET on Linux platforms and putting out solid community editions of Visual Studio may breathe new life into the community, bringing fresh young blood, new ideas and more open source software; maybe we’ll go back to solving hard problems too. It’s a huge weak spot in .NET and really hurts adoption, which I think hurts the available libraries and third-party tools (while Microsoft provides some good tools, the ecosystem is still heavily lacking in third-party tooling that isn’t just babysitting people who can’t write basic menus… where is my Code Climate for .NET?!).

I’ve been wanting to write an article on the abysmal state of the .NET community: its developers, the skill sets of a lot of them, and the kind of code they churn out. This is an excellent opportunity to do so.

What does this mean for businesses?

Well, if you’re a startup, maybe in a few years .NET on Linux will be a hot thing. C# 6.0 is a wonderful language, and with every iteration it gets more powerful (usually stealing from F#). Getting MSSQL on Linux seems to be the first of many steps for Microsoft to not just push its operating system by locking its solutions onto it, but to push solutions that you’d be happy to pay for due to their ease of use or specific desired features (which sometimes, for some companies, may be completely worth it).

If you’ve been knee-deep in .NET for years it can mean a few things: .NET Core’s performance is insane, which can mean a reduction in the resources your applications require. The build process is a lot easier to keep consistent and extend, so it’s less work to automate. Applications being shipped with their runtimes means less effort determining which runtimes are installed and handling them. It is, however, a large shift, and I would suggest that a lot of .NET developers aren’t ready for all this change (more on the community later…).
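
For a taste of that consistency, the whole pipeline is a handful of uniform CLI verbs (shown here conceptually; flags vary by version):

dotnet restore   # pull dependencies from NuGet
dotnet build     # compile the project
dotnet test      # run the test suite
dotnet publish   # lay out the app plus its runtime for deployment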

Of course, you can completely ignore it and mostly not deal with it for a while; .NET 4.5 isn’t going away anytime soon.

Overall…

I’m excited to no end about this: first because of the fixing of all the problems we’ve had, and second because this may really help the .NET community flourish. Fun new times in the land of .NET.

Backing Up Snapshots For VMware

Written by William Roush on April 17, 2016 at 8:01 pm

Sometimes backing up snapshots is useful, but lots of applications don’t do it out of the box… so how are we going to accomplish this?

Why Back Up Snapshots?

While everyone would jump on board the “snapshots aren’t backup” train (of which I’m a proud member), there are reasons you may want to back up snapshots. One of the biggest for me is that certain tools like TeamCity can leverage snapshots as checkpoints to boot your build agents from, and when you go to restore your virtual machines it would be nice not to have to recreate them.

I’m sure there are many other perfectly valid reasons to need long-lived snapshots (especially on non-production development/testing machines), so why not support that recovery mode?

The Problem

As far as I’ve seen, all backup software squashes your snapshots: restoring a virtual machine leaves you with a single state, and all snapshots are erased. Ouch! We’ll need to get clever.

Doing It With Veeam

This isn’t the best way to do this, but so far it seems to be the easiest: file copy jobs. We want the entire state of the virtual machine: all of its files, its snapshot descriptor files, everything. Now there are a few downsides to this:

  • Your backups will be larger – Veeam only backs up the current state; we will be backing up all states. You need to store all the snapshot deltas and any memory snapshots.
  • File lock issues – We’ll need to work around the locks on a powered-on virtual machine.
  • Storage – We don’t get clean vbk files; we’ll have full copies of whatever exists on the datastore.
  • Versioning – If we want to keep multiple copies over time, we’ll probably want to automate versioning our backup folders.
  • Tape – We lose some visibility pushing backup files to tape instead of vbks (though this only applies to the higher licensing tiers that can leverage that).

Doing it With Powered Off Virtual Machines

Easy enough: we pick the folder the virtual machine lives in on the datastore and back it up. Everything should go smoothly, and file locks shouldn’t bite us.

Doing it With Powered On Virtual Machines

This becomes extremely tricky: you need to back up only the unlocked files, which also means the current state will be trashed (if this isn’t OK, we can automate a new snapshot prior to the job running and commit it on completion; see the sketch after the file lists below). Here is a list of what I’m backing up to test this:

  • [VM]-000001.vmdk – Our VMDK for our current state (this one is a problem, as its delta file is locked)
  • [VM]-aux.xml
  • [VM].vmx – Virtual machine configuration file
  • [VM]-ctk.vmdk
  • [VM]-Snapshot15.vmsn
  • [VM]-000001-ctk.vmdk
  • [VM].vmdk – Our base VMDK metadata, our snapshot
  • [VM].vmsd
  • [VM].nvram
  • [VM]-flat.vmdk – Our base VMDK with our data on it, our snapshot

These files were locked:

  • [VM]-000001-delta.vmdk – Our delta file for our current state (after our snap)
  • [VM]*.lck – Anything with a “lck” extension appeared locked.
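
As for automating the pre-job snapshot mentioned above, a rough PowerCLI sketch might look like this (server and VM names are placeholders; the file copy job runs between the two snapshot steps):

# Connect to vCenter (prompts for credentials).
Connect-VIServer -Server vcenter.example.com

$vm = Get-VM -Name "build-agent-01"

# Take a quiesced, no-memory snapshot before the file copy job runs.
$snap = New-Snapshot -VM $vm -Name "pre-backup" -Quiesce -Memory:$false

# ... run the file copy job here ...

# Commit (delete) the snapshot once the job completes.
Remove-Snapshot -Snapshot $snap -Confirm:$false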

When restoring, we’ll create an invalid delta file to put things into a somewhat-OK state: SSH into your hypervisor, navigate to your virtual machine’s directory and type this (replace “000001” with the number of the delta you’re missing):


touch [VM]-000001-delta.vmdk

This will create an empty VMDK delta file. It’s invalid and your machine will not boot from it, but from this state you can revert to the last snapshot, putting everything back in a correct state.

The easiest way to do this is just to add all files to the file copy job and let the locked ones fail. A script will handle this best, since Veeam’s UI will not let you multi-select files, and selecting the whole folder makes the entire backup fail on the first locked file.
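
A rough PowerCLI sketch of that copy-what-you-can approach (the datastore path and destination are placeholders, and I’m assuming locked files throw on copy, so test it against your environment):

# Assumes an existing Connect-VIServer session.
$source = "vmstore:\MyDatacenter\MyDatastore\MyVM"
$dest   = "D:\Backups\MyVM"

foreach ($file in Get-ChildItem $source) {
    try {
        # Locked files (the active delta, *.lck) should fail here and get skipped.
        Copy-DatastoreItem -Item $file -Destination $dest
    }
    catch {
        Write-Warning "Skipped locked file: $($file.Name)"
    }
}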

You miss out on a lot of nice-to-haves: restoring involves copying the files back to the datastore (which can be done with a file copy job in the reverse direction) and adding the machine to your inventory manually. You also cannot restore to a newly named virtual machine; you’ll have to restore as-is and rename after it’s done. Be aware, too: transfer speeds seemed to suffer a lot with this kind of backup setup.

Additionally, this has only been shown to work under lab conditions, so please, as with any backup, test it first! Let me know if it works for you.


If there is enough interest maybe I’ll write up some PowerShell scripts to automate some of the more tricky stuff and post it.