Composing functions and leveraging scala.util.Try

The last couple of days I have been hacking around, trying to find a cute way to express the intent of my code. Typically it involves parsing some input, validating whatever rules apply and persisting some values. In Scala one can compose such a function by using the andThen method. Here is a concrete example:

[code lang=”scala”]
def id(x: String) = x
def parseInput = id _
def validate = id _
def persist = id _

def usecase1 = parseInput andThen validate andThen persist
[/code]

Inspired by the excellent Railway Oriented Programming series by Scott Wlaschin, I wanted to take advantage of scala.util.Try to remove try/catch clutter from my code. With a little helper function I can now compose my usecase as follows:

[code lang=”scala”]
import scala.util.Try

// wrap a plain function so that any exception is captured in a Try
def makeTry[TIn, TOut](fn: TIn => TOut) = (x: TIn) => Try(fn(x))

def usecase =
makeTry(parseInput andThen validate andThen persist) andThen
processErrors andThen
proceedOrRetry
[/code]
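
For completeness, here is a minimal sketch of what processErrors and proceedOrRetry could look like. Both are hypothetical implementations that would need to be in scope before usecase is defined; they only exist to make the snippet compile and run:

[code lang=”scala”]
import scala.util.{Failure, Try}

// hypothetical: log failures and pass the Try along unchanged
def processErrors(result: Try[String]): Try[String] = {
  result match {
    case Failure(e) => println(s"usecase failed: ${e.getMessage}")
    case _ => ()
  }
  result
}

// hypothetical: unwrap the result or fall back to a retry marker
def proceedOrRetry(result: Try[String]): String =
  result.getOrElse("retry")

usecase("some raw input") // success path: returns "some raw input"
[/code]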

Instead of composing functions, I could also have written the code as a chain of values that are transformed by subsequent functions, as follows (very much F#-like):

[code lang=”scala”]
import scala.language.implicitConversions

// pipe-forward operator, similar to F#'s |>
class Pipe[T](val x: T) {
def |> [U](f: T => U) = f(x)
}

implicit def toPipe[T](x: T) = new Pipe(x)

def usecase(x: String) = Try(x |> parseInput |> validate |> persist) |>
processErrors |>
proceedOrRetry
[/code]
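
For example, with the Pipe implicit in scope, any value can be piped through plain functions:

[code lang=”scala”]
val shout = (s: String) => s.toUpperCase
val exclaim = (s: String) => s + "!"

"hello" |> shout |> exclaim // yields "HELLO!"
[/code]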

As you can see, with Scala there is more than one way to express something in an elegant way! 😉

Using Gson to serialize Scala objects

Gson is a pretty nice library that converts Java objects into JSON and back. When using this library from Scala, things become a bit harder (e.g. plenty of people have difficulties when their Scala object has an (im)mutable Map or List).

Here is an example to convert a JSON object to a Map[String,String]:

[code lang=”scala”]
import com.google.gson.Gson
import scala.collection.JavaConversions._

val mapJson = "{ ‘a’: ‘b’, ‘c’: ‘d’ }"
val map = new Gson().fromJson(mapJson, classOf[java.util.Map[String, String]])
[/code]
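
With the JavaConversions import in scope, the resulting java.util.Map can then be turned into an immutable Scala Map (a minimal sketch):

[code lang=”scala”]
val scalaMap: Map[String, String] = map.toMap
[/code]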

Now that we know this works, we hide the Java types in the constructor and expose a nicer Scala type via a method:

[code lang=”scala”]
import com.google.gson.Gson
import scala.collection.JavaConversions._

case class Dummy(private val settings: java.util.Map[String, String]) {
def getSettings = settings.toMap
}

val dummyJson = "{ ‘settings’ : { ‘a’: ‘b’, ‘c’: ‘d’ } }"
val dummy = new Gson().fromJson(dummyJson, classOf[Dummy])

case class Dummy2(private val options: java.util.List[String]) {
def getOptions = options.toList
}

val dummy2Json = "{ ‘options’ : [ ‘a’, ‘b’, ‘c’, ‘d’ ] }"
val dummy2 = new Gson().fromJson(dummy2Json, classOf[Dummy2])

[/code]

Edit: One could simply use lift-json instead and get pretty good Scala support for free.
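
For illustration, here is a minimal sketch with lift-json (assuming the net.liftweb.json artifact is on the classpath; the ScalaDummy case class is just an example) that deserializes straight into Scala collections:

[code lang=”scala”]
import net.liftweb.json._

// plain Scala types, no java.util wrappers needed
case class ScalaDummy(settings: Map[String, String])

implicit val formats = DefaultFormats

val json = """{ "settings": { "a": "b", "c": "d" } }"""
val scalaDummy = parse(json).extract[ScalaDummy]
[/code]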

Notes on running spark-notebook

These days Docker makes it extremely easy to get started with virtually any application you like. At first I was a bit skeptical, but over the last couple of months I have changed my mind: I now strongly believe this is a game changer, even more so when it comes to Windows. Anyway, Kitematic (a GUI to manage Docker images) allows you to simply pick the spark-notebook image by Andy Petrella.

[image: docker_pick_image]

When running your Docker host in VirtualBox, you still need to set up port forwarding for port 9000 (the notebook) and ports 4040 to 4050 (the Spark UI). Assuming your Docker host VM is named default:

VBoxManage modifyvm "default" --natpf1 "tcp-port9000,tcp,,9000,,9000"
for i in {4040..4050}; do
VBoxManage modifyvm "default" --natpf1 "tcp-port$i,tcp,,$i,,$i";
done

Now you can browse to http://localhost:9000 and start using your new notebook:

[image: spark_notebook_home]

You may want to copy the default set of notebooks to a local directory:

docker cp $containerName:/opt/docker/notebooks /Users/timvw/notebooks

Using that local copy is just a few clicks away with Kitematic:

[image: docker_notebook_settings]

Of course you want to use additional packages such as spark-csv. This can be achieved by editing your notebook metadata:

[image: spark_notebook_metadata]

You simply need to add an entry to customDeps:

[image: spark_notebook_customdeps]
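
Once the dependency is resolved, reading a CSV file from a notebook cell looks roughly like this (a sketch assuming Spark 1.x, where sqlContext is available in the notebook and /tmp/data.csv is just a placeholder path):

[code lang=”scala”]
val df = sqlContext.read
  .format("com.databricks.spark.csv")
  .option("header", "true")
  .option("inferSchema", "true")
  .load("/tmp/data.csv")

df.printSchema()
df.show(5)
[/code]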

When your container did not shut down correctly, you may end up in the awkward situation that it believes it is still running. The following command fixes that:

docker start $containerName && docker exec -t -i $containerName /bin/rm /opt/docker/RUNNING_PID

ConcurrencyCheck with EF/Devart on Oracle

Earlier this week I was wondering how I could easily achieve optimistic concurrency in a system using EF/Devart targeting an Oracle database (not really my preferred technologies, but whatever… :P). Here is a potential solution:

Using a column for optimistic concurrency is documented on the Devart website:

[code lang=”csharp”]
[Table("TEST")]
public class Test : IRequireConcurrencyCheck
{
..
[Required]
[Column("VERSION")]
[ConcurrencyCheck] // <-- tell EF to use this column as our "timestamp"/logical version
public virtual int Version { get; protected set; } // protected, so users of this type can not touch this (easily)
}
[/code]

By introducing an interface and some custom behaviour on SaveChanges we can now take away the burden of having to update the Version property correctly:

[code lang=”csharp”]
public interface IRequireConcurrencyCheck
{
int Version { get; }
}
[/code]

[code lang=”csharp”]
public class DataContext : DbContext
{
public DataContext(DbConnection existingConnection)
: base(existingConnection, true)
{
Database.SetInitializer<DataContext>(null);
}

public override int SaveChanges()
{
var entitiesWhichHaveConcurrencyCheck = ChangeTracker
.Entries<IRequireConcurrencyCheck>()
.Where(x => x.State == EntityState.Modified)
.ToArray();

foreach (var entity in entitiesWhichHaveConcurrencyCheck)
{
entity.Property<int>(x => x.Version).CurrentValue++;
}

return base.SaveChanges();
}

public IDbSet<Test> Tests { get; set; }
}
[/code]

Failure to load mono-supplied .dylib (libgdiplus.dylib) when running from console

So earlier this week I was bitten by the following bug: Bug 22140 – Failure to load mono-supplied .dylib when running from console.

The workaround that works for me is the following: Edit /Library/Frameworks/Mono.framework/Versions/3.8.0/etc/mono/config and
replace the entries for libgdiplus:

[code language=”xml”]
<dllmap dll="gdiplus"
target="/Library/Frameworks/Mono.framework/Versions/3.8.0/lib/libgdiplus.dylib"
os="!windows"/>
<dllmap dll="gdiplus.dll"
target="/Library/Frameworks/Mono.framework/Versions/3.8.0/lib/libgdiplus.dylib"
os="!windows"/>
<dllmap dll="gdi32"
target="/Library/Frameworks/Mono.framework/Versions/3.8.0/lib/libgdiplus.dylib"
os="!windows"/>
<dllmap dll="gdi32.dll"
target="/Library/Frameworks/Mono.framework/Versions/3.8.0/lib/libgdiplus.dylib"
os="!windows"/>
[/code]

Deploying a Cloud Service to Azure with Octopus

Currently Octopus has limited support for deploying a Cloud Service on Azure. A typical use-case is that you need a different Web.config file per environment. Simply add the Web.{Environment}.config files to your NuGet package and use the following PreDeploy.ps1 script:

[code language=”powershell”]
# Load unzip support
[Reflection.Assembly]::LoadWithPartialName("System.IO.Compression.FileSystem") | Out-Null

function Unzip($zipFile, $destination)
{
If (Test-Path $destination){
Remove-Item $destination -Recurse | Out-Null
}
New-Item -ItemType directory -Force -Path $destination | Out-Null
[System.IO.Compression.ZipFile]::ExtractToDirectory($zipFile, $destination) | Out-Null
}

# Unzip deployment package
$CsPkg = "Customer.Project.Api.Azure.cspkg"
Unzip $CsPkg "azurePackage"
Unzip (Get-Item (join-path -path "azurePackage" -childPath "*.cssx")) "website"

# Perform replacements, eg: replace Web.Config
$ConfigFileToUse = "Web." + $OctopusParameters["Octopus.Environment.Name"] + ".config"
Copy-Item -Path $ConfigFileToUse -Destination "website/sitesroot/0/Web.Config" -Force

# Repackage
$role = "Customer.Project.Api"
$contentPath = "website\approot"
$rolePath = "website/approot"
$webPath = "website/sitesroot/0"
$cspackPath = "C:\Program Files\Microsoft SDKs\Windows Azure\.NET SDK\v2.2\bin\cspack.exe"
& $cspackPath "ServiceDefinition.csdef" "/out:$CsPkg" "/role:$role;$rolePath;Customer.Project.Api.dll" "/sites:$role;Web;$webPath" "/sitePhysicalDirectories:$role;Web;$webPath"
[/code]

Cute sort implementation

For years I had been implementing my sort functions as follows:

[code lang=”csharp”]
(x, y) => {
    if (x.PartName == null && y.PartName == null) return 0;
    if (x.PartName == null) return -1;
    if (y.PartName == null) return 1;
    return x.PartName.CompareTo(y.PartName);
}
[/code]

Earlier today I found the following cute variant while browsing through the ServiceStack codebase:

[code lang=”csharp”]
(x, y) => x.Priority - y.Priority
[/code]

Clone all your repositories on another machine

Recently I was configuring a new machine (God, I love Chocolatey) and I wanted to take all the repositories I have under c:/src and clone them on my new machine. Here is how I did that:

[code lang=”bash”]
# write all remote fetch locations into repositories.txt
find /c/src -type d -mindepth 1 -maxdepth 1 -exec git --work-tree={} --git-dir={}/.git remote -v \; | grep fetch | awk '{print $2}' > repositories.txt

# clone each repository
cat repositories.txt | xargs -l1 git clone
[/code]

Or as a gist: https://gist.github.com/timvw/11208834.