BenchmarkDotNet and Json.NET custom converters — useful usage

TodlyCodly
4 min read · Oct 17, 2022



Some background

Recently I had to serialize data from derived C# classes to JSON, and the best fit was a discriminated union. Sadly, C# does not support this natively.

The simplest solution is to use a base class with a Discriminator property, where each derived class uses a different value for Discriminator.
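As a minimal sketch of that pattern (the class names are my own illustration, not taken from the article's repository):

```csharp
// Base class exposing the discriminator; each derived class returns
// its own constant value. (Illustrative names, not from the article.)
public abstract class Shape
{
    public abstract string Discriminator { get; }
}

public class Circle : Shape
{
    public override string Discriminator => nameof(Circle);
    public double Radius { get; set; }
}

public class Square : Shape
{
    public override string Discriminator => nameof(Square);
    public double Side { get; set; }
}
```

Serializing a `Circle` then yields JSON whose `Discriminator` field tells a reader which concrete type to rebuild.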

To handle this situation, I needed a custom converter and a custom contract resolver. I don’t know if this is common, but whenever I am in a situation where I need to write:

public class Custom<AnyClassName>
{...}

I feel uneasy. Here is why:

  1. 9/10 times I need something custom because I did something wrong.
  2. 9/10 times my custom solution is just reinventing the wheel.
  3. 9/10 times my unit tests don’t cover the cases that happen in real life.
  4. 9/10 times I worry about performance.

Is my implementation good?

C# is not an opinionated language; there are always multiple ways to make something work. Thanks to BenchmarkDotNet I can always do some quick A/B testing and verify that, at the very least, performance is not terrible.

Code

Imagine you need to serialize and deserialize an object which can be one of the following:

example classes

Moreover, they could be completely independent, so I cannot just combine them. To build a pseudo discriminated union, I can add a Discriminator property and encode the type in it for later decoding:
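A hypothetical, self-contained illustration of the resulting wire format: two classes that share no base type, each carrying its own Discriminator value.

```csharp
using System;
using Newtonsoft.Json;

// Two independent classes (my own illustration); the Discriminator
// value is the only thing identifying the type in the JSON.
public class Cat
{
    public string Discriminator { get; set; } = "Cat";
    public int Lives { get; set; }
}

public class Car
{
    public string Discriminator { get; set; } = "Car";
    public int Wheels { get; set; }
}

public static class Demo
{
    public static void Main()
    {
        Console.WriteLine(JsonConvert.SerializeObject(new Cat { Lives = 9 }));
        // {"Discriminator":"Cat","Lives":9}
        Console.WriteLine(JsonConvert.SerializeObject(new Car { Wheels = 4 }));
        // {"Discriminator":"Car","Wheels":4}
    }
}
```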

This approach is not supported by Json.NET out of the box, so additionally I need to:

  1. Deserialize only the discriminator to identify which class should be used for deserialization.
  2. Map the discriminator to one of the defined types.
  3. Remove the custom converter (more in point 6).
  4. Cast to the final type.
  5. Make this converter trigger only for Read operations and use the default for Write.
  6. If we just tried to deserialize to the final type, we would trigger the same converter again and start a recursion loop. For the derived type, we ask Json.NET to forget about the custom converter and use the default one.
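Putting the six steps together, here is a sketch of how such a converter could look. The type names and the CleanupResolver internals are my reconstruction from the description above, not necessarily the repository's exact code.

```csharp
using System;
using Newtonsoft.Json;
using Newtonsoft.Json.Linq;
using Newtonsoft.Json.Serialization;

// Illustrative hierarchy; the converter is attached to the base type.
[JsonConverter(typeof(DiscriminatorConverter))]
public abstract class Shape
{
    public abstract string Discriminator { get; }
}

public class Circle : Shape
{
    public override string Discriminator => nameof(Circle);
    public double Radius { get; set; }
}

public class Square : Shape
{
    public override string Discriminator => nameof(Square);
    public double Side { get; set; }
}

// Steps 3 and 6: a resolver that strips the converter from the contract,
// so nested deserialization uses Json.NET's default machinery.
public class CleanupResolver : DefaultContractResolver
{
    protected override JsonContract CreateContract(Type objectType)
    {
        var contract = base.CreateContract(objectType);
        contract.Converter = null;
        return contract;
    }
}

public class DiscriminatorConverter : JsonConverter
{
    public override bool CanConvert(Type objectType)
        => typeof(Shape).IsAssignableFrom(objectType);

    // Step 5: only Read is customized; Write falls back to the default.
    public override bool CanWrite => false;

    public override object ReadJson(JsonReader reader, Type objectType,
        object existingValue, JsonSerializer serializer)
    {
        var obj = JObject.Load(reader);

        // Step 1: deserialize only the discriminator.
        var discriminator = (string)obj["Discriminator"];

        // A serializer whose resolver forgets this converter, built on
        // every call to ReadJson.
        var inner = new JsonSerializer { ContractResolver = new CleanupResolver() };

        // Steps 2 and 4: pick the concrete type and deserialize fully.
        switch (discriminator)
        {
            case nameof(Circle): return obj.ToObject<Circle>(inner);
            case nameof(Square): return obj.ToObject<Square>(inner);
            default: throw new JsonSerializationException(
                $"Unknown discriminator: {discriminator}");
        }
    }

    public override void WriteJson(JsonWriter writer, object value, JsonSerializer serializer)
        => throw new NotSupportedException(); // unreachable: CanWrite is false
}
```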

Can you spot the bug?

Setup of Benchmark

I don’t want to duplicate the BenchmarkDotNet overview page, so let’s go straight to the code:

What is important is that I can prepare a simple method and add the attribute:

[Benchmark]
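For example (a hypothetical benchmark class; the method name matches the results section below, but the body is my guess):

```csharp
using BenchmarkDotNet.Attributes;
using Newtonsoft.Json;

public class SerializationBenchmarks
{
    public class Poco { public double Radius { get; set; } }

    private const string Json = "{\"Radius\":2.0}";

    // Baseline: a plain POCO round trip with no custom converter.
    [Benchmark(Baseline = true)]
    public string PocoSerializeAndDeserialize()
    {
        var poco = JsonConvert.DeserializeObject<Poco>(Json);
        return JsonConvert.SerializeObject(poco);
    }
}
```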

To run the benchmark, I needed to add a console application to the project with this code:

Run it like any other console app (it needs to be run in Release mode, but the error message will help you get it right).
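A minimal sketch of that console entry point. It assumes a benchmark class named SerializationBenchmarks (a name of my choosing; the stub body here only stands in for the real benchmark method):

```csharp
using BenchmarkDotNet.Attributes;
using BenchmarkDotNet.Running;

public class SerializationBenchmarks
{
    [Benchmark]
    public string PocoSerializeAndDeserialize() => "stub"; // placeholder body
}

public class Program
{
    // BenchmarkDotNet discovers the [Benchmark] methods on the given
    // class and runs them; it will refuse to run a Debug build.
    public static void Main() => BenchmarkRunner.Run<SerializationBenchmarks>();
}
```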

Results

I am showing just the important part:

PocoSerializeAndDeserialize is the baseline, which uses no custom converters at all.

ConverterSerializeAndDeserilize is the code I’ve just written. It is 466× slower. In absolute terms it takes only ~1.8 ms, but I would rather spend those 1.8 ms executing other bad code, not this one. The simple solution was:

make serializer.ContractResolver use a static instance of CleanupResolver instead of building a new one every time.
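Sketched out (again with my reconstruction of CleanupResolver): keep one shared resolver instance, because Json.NET caches resolved contracts per resolver instance, and a fresh resolver per call throws that cache away.

```csharp
using System;
using Newtonsoft.Json;
using Newtonsoft.Json.Serialization;

public class CleanupResolver : DefaultContractResolver
{
    // One shared instance. Creating a fresh resolver per deserialization
    // discards Json.NET's per-resolver contract cache, which is what made
    // the converter version hundreds of times slower.
    public static readonly CleanupResolver Instance = new CleanupResolver();

    protected override JsonContract CreateContract(Type objectType)
    {
        var contract = base.CreateContract(objectType);
        contract.Converter = null; // still strips the custom converter
        return contract;
    }
}

// Inside the converter's ReadJson, reuse the shared instance:
//   var inner = new JsonSerializer { ContractResolver = CleanupResolver.Instance };
```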

This might seem obvious, but without the benchmark I would never have found out about it.

Conclusion

Benchmarking and optimization are usually introduced when things go bad: not a 1.8 ms lag but a 1.8 s one. You might think, “How do I know there is something wrong?” Well, this is hard to predict. But in my opinion, anything named Custom is a good starting point.

All code can be checked here:

mes1234/Jasoon (github.com)

If you found this article useful, or you just want to comment, please do :)


Written by TodlyCodly

C# developer, who once was Pythonista and dreams of being Golang guy.
