JSON Import In .NET

by Anton 22. July 2011 13:37

Bantam Is Quitting Services

We at teamaton were using bantam for all of our todos. At the beginning of this year bantam was bought by ConstantContact, and they announced that bantam would cease services as of July 1. Since we are developing our own todo management tool (see our blog), we decided to push its development and use it instead of bantam. Of course we wanted to take all of our todos with us. We used bantam's export feature, which gave us a JSON file with all our tasks (closed and open ones). So I took on the task of writing a JSON import feature for our tool.

Json.NET

After a bit of research, I found that the library Json.NET would suit our import needs perfectly. Applying the deserialization was pretty straightforward – the documentation helped a lot. Here is the code from the Import controller:

```csharp
[HttpPost]
public ActionResult Import(HttpPostedFileBase file)
{
    var todos = new List<Todo>();

    if (file != null && file.ContentLength > 0)
    {
        var streamReader = new StreamReader(file.InputStream);
        string text = streamReader.ReadToEnd();
        streamReader.Close();

        var bantamTodos = JsonConvert.DeserializeObject<IList<BantamTodo>>(text) as List<BantamTodo>;
        todos = bantamTodos.Select(bantamTodo => bantamTodo.ConvertToTodo()).ToList();
        _todoRepository.SaveImport(todos);
    }

    return RedirectToAction("List");
}
```

It just opens the file, extracts the content as a string, deserializes the string into a list of bantam todos, and then converts these bantam todos into our "normal" todos.

Indirection Via the BantamTodo Class

As you can see, I did not convert the JSON directly into our Todo class. You can use attributes and a converter class to deserialize JSON into a class of your liking. There were two reasons why I chose not to do so: I did not want to burden the Todo class with attributes and converters, and I thought it would be easier to introduce a middle class (BantamTodo) that poses as both container and converter.
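For comparison, here is what the direct-deserialization route with attributes could look like in Json.NET – a minimal sketch only, since the property and JSON field names below are assumptions for illustration, not taken from the actual bantam export:

```csharp
using System;
using Newtonsoft.Json;

public class Todo
{
    // [JsonProperty] maps a JSON field name onto a differently named .NET property.
    [JsonProperty("name")]
    public string Description { get; set; }

    [JsonProperty("created_at")]
    public DateTime DateCreated { get; set; }
}

public class Program
{
    public static void Main()
    {
        var json = "{\"name\":\"my todo\",\"created_at\":\"2011-06-30T00:41:57\"}";
        var todo = JsonConvert.DeserializeObject<Todo>(json);
        Console.WriteLine(todo.Description); // my todo
    }
}
```

This works fine, but it ties the domain class to the import format – exactly the coupling the middle class avoids.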
I used a nice tool to take a good look into the original JSON file: JSON Viewer. With that information about the structure of the JSON file, I started implementing test-first. Here is my test class, which tests the deserialization of the bantam todos and the conversion from BantamTodo to Todo:

```csharp
[Test]
public void Should_Import_BantamToDo_FromJson()
{
    var jsonToDo = ArrangeTaskAsJson();

    var bantamToDo = JsonConvert.DeserializeObject<BantamTodo>(jsonToDo);

    bantamToDo.Categoy.Should().Be.EqualTo("Organisation");
    bantamToDo.Complete.Should().Be.EqualTo(true);
    bantamToDo.Created_At.Should().Be.EqualTo(new DateTime(2011, 6, 30, 0, 41, 57));
    bantamToDo.Due.Should().Be.EqualTo(new DateTime(2011, 7, 1));
    bantamToDo.Author.Name.Should().Be.EqualTo("Anton");
    bantamToDo.Assigned_To.Name.Should().Be.EqualTo("Oliver");
    bantamToDo.Related_To[0].Name.Should().Be.EqualTo("ToDo-Management Tool");
    bantamToDo.Name.Should().Be.EqualTo("Entwicklung nach Gebieten Personen zuordnen - Verantwortliche, Blogs, etc.");
    bantamToDo.Description.Should().Be.EqualTo("some good description");
    bantamToDo.Flagged.Should().Be.EqualTo(true);
}

[Test]
public void Should_Convert_BantamToDo_ToTodo()
{
    var jsonToDo = ArrangeTaskAsJson();
    var bantamToDo = JsonConvert.DeserializeObject<BantamTodo>(jsonToDo);

    var todo = bantamToDo.ConvertToTodo();

    todo.Status.Should().Be.EqualTo(bantamToDo.Complete ? Status.Closed : Status.Open);
    todo.Description.Should().Contain(bantamToDo.Name);
    todo.Description.Should().Contain(bantamToDo.Description);
    todo.Tags.Select(t => t.Name).Should().Contain(bantamToDo.Categoy);
    foreach (var bantamProject in bantamToDo.Related_To)
        todo.Tags.Select(t => t.Name).Should().Contain(bantamProject.Name);
    todo.DateCreated.Should().Be.EqualTo(bantamToDo.Created_At);
    todo.DateCompleted.Value.Date.Should().Be.EqualTo(bantamToDo.Due);
    todo.DateDue.Should().Be.EqualTo(bantamToDo.Due);
    todo.Creator.Name.Should().Be.EqualTo(bantamToDo.Author.Name);
    todo.Assignee.Name.Should().Be.EqualTo(bantamToDo.Assigned_To.Name);
    todo.Priority.Value.Should().Be.EqualTo(bantamToDo.Flagged ? 2 : 0);
}
```

The implementation was pretty straightforward. Since it was my first time working with MVC, and also my first time working with JSON, it took me some time. All in all – research, export and meetings included – it took me about 12 hours. If you have any suggestions for improvement, I would appreciate them. If you are trying to import JSON into .NET yourself, I hope that this article helps.
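As a closing aside for readers who wonder what the middle class looks like: the tests above pin down its shape quite well. Below is a hedged sketch of BantamTodo – the property names follow the tests, but the stub types and the body of ConvertToTodo are my reconstruction, not the original code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public enum Status { Open, Closed }
public class Person { public string Name { get; set; } }
public class Tag { public string Name { get; set; } }

// Minimal stand-in for the real Todo class (which has more members).
public class Todo
{
    public Status Status { get; set; }
    public string Description { get; set; }
    public List<Tag> Tags { get; set; }
    public DateTime DateCreated { get; set; }
    public DateTime? DateCompleted { get; set; }
    public DateTime DateDue { get; set; }
    public Person Creator { get; set; }
    public Person Assignee { get; set; }
    public int? Priority { get; set; }
}

public class BantamPerson { public string Name { get; set; } }
public class BantamProject { public string Name { get; set; } }

// Container + converter: property names mirror the JSON fields
// (including the "Categoy" spelling from the export).
public class BantamTodo
{
    public string Name { get; set; }
    public string Description { get; set; }
    public string Categoy { get; set; }
    public bool Complete { get; set; }
    public bool Flagged { get; set; }
    public DateTime Created_At { get; set; }
    public DateTime Due { get; set; }
    public BantamPerson Author { get; set; }
    public BantamPerson Assigned_To { get; set; }
    public IList<BantamProject> Related_To { get; set; }

    public Todo ConvertToTodo()
    {
        // Category and related projects both become tags.
        var tags = new List<Tag> { new Tag { Name = Categoy } };
        tags.AddRange(Related_To.Select(p => new Tag { Name = p.Name }));

        return new Todo
        {
            Status = Complete ? Status.Closed : Status.Open,
            Description = Name + Environment.NewLine + Description,
            Tags = tags,
            DateCreated = Created_At,
            DateCompleted = Complete ? Due : (DateTime?)null,
            DateDue = Due,
            Creator = new Person { Name = Author.Name },
            Assignee = new Person { Name = Assigned_To.Name },
            Priority = Flagged ? 2 : 0
        };
    }
}
```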

Creating a new module in ‘discoverize’ – using multi-file templates and good ol’ batch scripts

by Oliver 15. July 2011 09:07

For our portal software discoverize I was looking for a way to create new modules faster and more reliably. The basic structure would always be the same, so a Visual Studio multi-file template seemed appropriate. Well, unfortunately I didn't find a way to create new folders with that approach. Multi-file templates really do what they say: they create multiple files from templates. Nothing else.

So I put together a short batch script that creates the directory structure needed for any new module. I can quickly open a new command line window using any one of several Visual Studio extensions (e.g. PowerCommands for Visual Studio 2010) … and simply run the script.

Going back to Visual Studio, we have to include the new Feature folder in the project. Then hit Ctrl + Shift + A to open the Add New Item dialog, select 'Discoverize Module' and type Feature into the Name textbox (unfortunately, there seems to be no easy way to automatically put the name of the folder into that textbox). This step generates the three code files that are the backbone of every module: FeatureConfig.cs, FeatureModule.cs, and FeatureViews.cs. Finally, our multi-file item template comes into play!

Handling the multi-file template

The multi-file item template for a new module consists of four files: the template definition file Module.vstemplate and the template code files Config.cs, Module.cs, and Views.cs. Those four files have to be packed into a zip file and copied to a folder underneath %UserProfile%\My Documents\Visual Studio 2010\Templates\ItemTemplates\ – I put this one into Visual C#\Code, which is why it appears under the Visual C# –> Code section of the Add New Item dialog.

Since it is somewhat cumbersome to zip and copy updated versions of the template (especially during early development, where I keep adjusting and tuning the template code), I put together another batch file that does that for me.
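The directory-creation script itself appeared only as a screenshot in the original post; a minimal sketch of what such a batch file could look like – the subfolder names here are assumptions for illustration, not the actual discoverize layout:

```bat
@echo off
rem create-module.cmd – sketch: creates the folder skeleton for a new module
rem usage: create-module ModuleName
if "%~1"=="" echo Usage: create-module ModuleName & exit /b 1

mkdir "%~1"
mkdir "%~1\Views"
mkdir "%~1\Content"
echo Created module skeleton for "%~1".
```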
It basically does three things: get the name of the current folder to use as the name for the zip file (I found the solution for that here), use 7-Zip to zip the four files, and copy the zip file to the VS custom template directory. The real script contains some safety nets and more output, so that in case it doesn't work across all developer machines I can get quick feedback as to what exactly didn't work, instead of just "it didn't work". Happy Coding!
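The zip-and-copy script was likewise only shown as a screenshot; the three steps can be sketched roughly like this, with the 7-Zip location and the target path being assumptions (and without the safety nets of the real script):

```bat
@echo off
rem pack-template.cmd – sketch: zip the template files and copy them to the VS item template folder

rem 1. use the current folder's name as the zip file name
for %%i in ("%CD%") do set TemplateName=%%~nxi

rem 2. zip the four template files with 7-Zip
"%ProgramFiles%\7-Zip\7z.exe" a "%TemplateName%.zip" Module.vstemplate Config.cs Module.cs Views.cs

rem 3. copy the zip to the custom item template directory
copy "%TemplateName%.zip" "%UserProfile%\My Documents\Visual Studio 2010\Templates\ItemTemplates\Visual C#\Code\"
```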

Code evolution + LINQ

by Oliver 28. June 2011 01:27

Three year old code:

```csharp
protected string CpeBehaviorIds()
{
    var cpeIds = "";

    var helpItems = GetHelpItems(divGlobal);

    foreach (var helpItem in helpItems)
        cpeIds += helpItem.CollapsiblePanelBehaviorID + ',';

    // remove comma at end
    if (cpeIds.Length > 0)
        cpeIds = cpeIds.Remove(cpeIds.Length - 1);

    return cpeIds;
}

protected string CpeExpandIds()
{
    var cpeIds = "";

    var helpItems = GetHelpItems(divGlobal);

    foreach (var helpItem in helpItems)
        cpeIds += helpItem.CollapsiblePanelExpandID + ',';

    // remove comma at end
    if (cpeIds.Length > 0)
        cpeIds = cpeIds.Remove(cpeIds.Length - 1);

    return cpeIds;
}

protected static List<HelpItem> GetHelpItems(Control control)
{
    var idList = new List<HelpItem>();

    if (control is HelpItem)
        idList.Add(control as HelpItem);
    else
        foreach (Control child in control.Controls)
            idList.AddRange(GetHelpItems(child));

    return idList;
}
```

New code:

```csharp
protected string CpeBehaviorIds()
{
    return divGlobal.Controls<HelpItem>().Select(h => h.CollapsiblePanelBehaviorID).JoinNonEmpty(",");
}

protected string CpeExpandIds()
{
    return divGlobal.Controls<HelpItem>().Select(h => h.CollapsiblePanelExpandID).JoinNonEmpty(",");
}

public static string JoinNonEmpty(this IEnumerable<string> values, string separator)
{
    return String.Join(separator, values.Where(s => !string.IsNullOrEmpty(s)).ToArray());
}
```

LINQ – we love you!

Oliver

P.S. Controls<Type>() is another extension method, defined like this:

```csharp
/// <summary>
/// Returns all controls of the given Type that are found inside this control.
/// Searches recursively.
/// </summary>
public static IEnumerable<T> Controls<T>(this Control control) where T : Control
{
    var controls = control.Controls;

    if (controls.Count == 0) return new List<T>(0);

    var newColl = new HashedSet<T>();
    foreach (Control child in controls)
    {
        if (child is T)
            newColl.Add((T) child);

        var childColl = child.Controls<T>();
        foreach (T ctrl in childColl)
            newColl.Add(ctrl);
    }

    return newColl;
}
```

Sorting some result list by the ids of another list

by Oliver 24. June 2011 21:47

Imagine we have the following list of ids of some kind of objects, and a mapping of some of the ids to some values (also ints here):

```csharp
var ids = new List<int> { 6, 2, 5, 3, 7 };
var valueMap = new List<int[]> { new[] { 5, 15 }, new[] { 2, 12 }, new[] { 3, 13 } };
```

Now, if we want to return the results – a.k.a. the valueMap – in the same order as the ids, we can use the LINQ extension method .Join() like this:

```csharp
var joined = ids.Join(valueMap, id => id, arr => arr[0],
                      (id, arr) => string.Format("id: {0} - value: {1}", id, arr[1]));
```

Really convenient! Let's look at the output:

```csharp
Console.WriteLine(string.Join("\r\n", joined));

// Prints:
// id: 2 - value: 12
// id: 5 - value: 15
// id: 3 - value: 13
```

By the way, I use FastSharp to write and test this kind of small code snippet.

Happy Coding, Oliver
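One caveat worth adding to the example above: .Join() performs an inner join, so ids without a mapping – 6 and 7 here – silently drop out of the result. If you need to keep them, a GroupJoin/DefaultIfEmpty left join does the trick; a small sketch:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

public class Program
{
    public static void Main()
    {
        var ids = new List<int> { 6, 2, 5, 3, 7 };
        var valueMap = new List<int[]> { new[] { 5, 15 }, new[] { 2, 12 }, new[] { 3, 13 } };

        // GroupJoin + DefaultIfEmpty keeps unmatched ids, with null standing in for the missing mapping.
        var joined = ids
            .GroupJoin(valueMap, id => id, arr => arr[0], (id, arrs) => new { id, arrs })
            .SelectMany(x => x.arrs.DefaultIfEmpty(),
                        (x, arr) => string.Format("id: {0} - value: {1}",
                                                  x.id, arr == null ? "?" : arr[1].ToString()));

        Console.WriteLine(string.Join("\r\n", joined));
        // Prints:
        // id: 6 - value: ?
        // id: 2 - value: 12
        // id: 5 - value: 15
        // id: 3 - value: 13
        // id: 7 - value: ?
    }
}
```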

Mocking for the first time – a short overview of a few .NET mocking frameworks

by Oliver 25. March 2011 15:50

In my recent post Testing: trying to get it right I mentioned that a lot of our tests are of the dirty hybrid kind, somewhere between real unit tests and real integration tests. Focusing on the unit test part, we're looking into using a mocking framework right now to change the way we write tests – most of all to decouple the different components that we use in the application under test. Wanting to use the fresh and hyped NuGet package manager to install the mocking frameworks, I chose among the ones that were both available there and also looked promising:

RhinoMocks: the sample found in the introduction did not look at all inviting to me, so I dropped this one
Telerik JustMock (Free Edition)
Moq
FakeItEasy

Really, it could not have been easier to get all these libraries into the project than by using NuGet!

Sample code

So I went ahead and wrote a short and simple test just to get a feel for the syntax they offer. At first I wanted to mock a simple repository using its interface. Here is what I ended up with:

Telerik JustMock (using syntax from their quick start manual):

```csharp
public void MockRepositoryInterfaceTest()
{
    // Arrange
    var repo = Mock.Create<ITodoRepository>();
    Mock.Arrange(() => repo.Todos)
        .Returns(new List<Todo> { new Todo { Description = "my todo" } }.AsQueryable);

    // Act + Assert
    repo.Todos.ToList().Count.Should().Be.EqualTo(1);
}
```

Moq (just visit the project homepage):

```csharp
public void MockRepositoryInterfaceTest()
{
    // Arrange
    var repo = new Mock<ITodoRepository>();
    repo.SetupGet(rep => rep.Todos)
        .Returns(() => new List<Todo> { new Todo { Description = "my todo" } }.AsQueryable());

    // Act + Assert
    repo.Object.Todos.ToList().Count.Should().Be.EqualTo(1);
}
```

FakeItEasy (just visit the project homepage):

```csharp
public void MockRepositoryInterfaceTest()
{
    // Arrange
    var repo = A.Fake<ITodoRepository>();
    A.CallTo(() => repo.Todos)
        .Returns(new List<Todo> { new Todo { Description = "my todo" } }.AsQueryable());

    // Act + Assert
    repo.Todos.ToList().Count.Should().Be.EqualTo(1);
}
```

On this first test ride, FakeItEasy and JustMock look pretty much identical, whereas the syntax Moq offers is a bit awkward, with the SetupGet() method name and the need to call repo.Object to get at the instance. I hope to examine further differences in use as the project moves on.

Mocking concrete classes

Since we're working with a large application that still has a lot of services and classes not implementing any interface, I also wanted to make sure we'll be able to mock concrete types. Well, this didn't go so well: JustMock and FakeItEasy simply returned an instance of the concrete class I gave them, and Moq complained that it couldn't override the Todos member. So I added the virtual modifier to it, and the test is now green. Still, I got the impression that I was trying to do something that I shouldn't. The following blog post gives further motivation not to mock concrete classes: Test Smell: Mocking concrete classes. So I guess introducing interfaces as a kind of contract between classes is the way to go, but in the meantime, and where we can't avoid mocking concrete types, we'll be left using Moq.

That's it for now, happy coding!

Oliver
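To make the concrete-class case above concrete: this is roughly what the green Moq test looked like after adding virtual – the repository class here is a simplified stand-in for illustration, not our actual code:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Moq;

public class Todo { public string Description { get; set; } }

// Concrete class, no interface – Moq can only override members marked virtual,
// because it works by generating a subclass at runtime.
public class TodoRepository
{
    public virtual IQueryable<Todo> Todos
    {
        get { return new List<Todo>().AsQueryable(); }
    }
}

public class MockConcreteClassTest
{
    public void MockRepositoryConcreteClassTest()
    {
        // Arrange
        var repo = new Mock<TodoRepository>();
        repo.SetupGet(rep => rep.Todos)
            .Returns(() => new List<Todo> { new Todo { Description = "my todo" } }.AsQueryable());

        // Act + Assert: repo.Object is a generated subclass with Todos overridden.
        if (repo.Object.Todos.ToList().Count != 1)
            throw new Exception("mocked getter was not used");
    }
}
```

Remove the virtual modifier and Moq throws at setup time, which is exactly the failure described above.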

Testing: trying to get it right

by Oliver 9. February 2011 10:54

Read a great post on Steve Sanderson's blog with the promising title Writing Great Unit Tests – and it is definitely worth reading. He mentions another post with the title Integration Testing Your ASP.NET MVC Application, which I also recommend. One of the eye openers for me was this quote in his post: "TDD is a design process, not a testing process". Let me elaborate: "TDD is a robust way of designing software components ("units") interactively so that their behaviour is specified through unit tests."

I must admit that I haven't read much about TDD yet – but we've been writing tests for quite some time now. Unfortunately, most of them probably fall into the Dirty Hybrids category that Sanderson sees between two good ends, one being True Unit Tests, the other being Integration Tests. I allow myself to add his illustration here:

So it looks like our goal should be to write tests in either of the outer categories and slowly but surely get rid of the time-consuming, easy-to-break hybrid tests.

One problem that a lot of people writing web applications are confronted with at some point is testing the whole application stack, from browser action over server reaction to browser result. We've put some effort into abstracting away the HttpContext class so we can use the abstraction both in our frontend and in tests, but it falls short of being a worthy replacement for the real HttpRequest and HttpResponse classes. With all that work, we're still missing a way to use our abstraction in a third-party URL rewriting engine, so the requests we are testing never get processed by it. We have tests for the rules that are applied by the engine, but for more sophisticated setups this simply is not enough.

Thanks to a link in Steve Sanderson's blog post on integration testing ASP.NET MVC applications, I stumbled upon Phil Haack's HttpSimulator – and it looks just like the piece of the puzzle we've been missing all that time. (I have no idea how we didn't find it earlier.)
Another thing I'm new to is the kata. Kata is a Japanese word describing detailed choreographed patterns of movements, practiced either solo or in pairs. A code kata is an exercise in programming which helps hone your skills through practice and repetition. At first it might sound weird to practice and repeat the same exercise over and over again. But then again, if we think of musicians or sportsmen, it's not hard to see that they become great at what they do only by practicing. A lot. And they practice the same move (let's just call it that) over and over again. The idea behind the code kata is that programmers should do the same. This is what Dave Thomas, one of the authors of The Pragmatic Programmer, also promotes on his blog.

I stumbled upon a very interesting kata in a blog post by Robert Martin about test-driven development, which deals with the evolution of tests and poses the assumption that there might be a kind of priority for test code transformations that will lead to good code: The Transformation Priority Premise. The kata he mentions further down is the word wrap kata. The post is rather long, so allow at least 15 minutes to read it. For me it was well worth it. The solution to a very simple problem – at least a problem that's easy to explain – can be quite challenging, and the post shows hands-on how writing good tests can help you find a good solution faster. It showed me (again) that writing good tests lets you write good code. Just like the quote above states: TDD is a design process.

Happy coding – through testing ;-)

Oliver

Automatic deployment of an ASP.NET Web Application Project with TeamCity and MSBuild

by Oliver 21. January 2011 20:19

We recently updated one of our largest projects to use ASP.NET 4.0, and with it the new Package/Publish feature including sub-web.configs, which is meant to supersede the Web Deployment Project. For a manual deployment there's a good write-up in the MSDN library titled ASP.NET Web Application Project Deployment Overview, which shows how and where to set this up. In our case this was not satisfactory, because our deployment process is a bit more complicated.

We push our changes to a central repository and use JetBrains' continuous integration server (CIS) TeamCity Professional, which is totally free at our project size. Once TeamCity has pulled and tested the current version, it is supposed to deploy it to our staging server, where we further test the complete site.

The key point in an automatic deployment was the management of the different web.config files for the different environments our project runs on. Unfortunately, until yesterday every deployment that included changes to the web.config file – even to the staging server – required a manual step: editing the web.config that lives on our staging system (outside of source control!). What we used to do: after a successful build on our CIS, we simply copied the web application files to our staging server! But as Scott Hanselman wrote: If You're Using XCopy, You're Doing It Wrong! This post inspired us to move along and take advantage of the new possibilities we were given. In the meanwhile – before switching to .NET 4.0, actually – we also took a shot at the Web Deployment Project way of doing things, but never got far enough to fully automate the deployment; somehow the setup was not as easy as we had hoped. Anyway, we wanted web.config transforms!

So what does our setup look like, and what did we want to do? During local development and testing I use a web.config file that talks to a local DB instance and has some more specific settings.
To run the web application on our staging server we need to replace certain values or whole sections in the web.config. For this transformation we use the sub-web.config files, one for each build configuration.

Now, with all of these web.config files, the simple XCOPY deployment we used to use does not work any longer. We need to trigger the web.config transformation on the build server and then deploy the whole application. As easy as this looks using the built-in menus and dialogs in Visual Studio, it took me quite a while to find out how to do it in an automated build, more concretely from the command line. After unsuccessfully skimming stackoverflow.com for a solution, I finally tripped over this very informative blog post on publishing a VS2010 ASP.NET web application using MSBuild. Admittedly, the author focuses on how to publish on the local machine, which is yet a different process, but towards the end he posts the solution I was looking for:

```shell
msbuild Website.csproj "/p:Platform=AnyCPU;Configuration=Release;DesktopBuildPackageLocation=c:\_Publish\stage\Website.zip" /t:Package
```

This was it! After running this on my machine with my own settings, I looked into the folder with the zip file and found five files next to it. At first I just wanted to take the zip file, copy it to the staging server, unpack it – done! But then I peeked into it… and deeper… and deeper… and… still deeper… until I finally saw our application files underneath one of the longest paths I've ever seen and used! How would I automate the extraction of the web application files from a zip with such a path? I was already seeing myself hacking away on the command line… But wait: what about those files that appeared next to the zip file? A ci-stage.deploy.cmd and a readme.txt caught my attention – of course, I opened the cmd file first :-D Well… maybe the readme file gives me a shortcut to understanding it and the rest of the 190 lines: looks promising!
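To illustrate the transform syntax used in those sub-web.config files: a build-configuration-specific file such as Web.Release.config (or, in our naming, a ci-stage variant) typically swaps out values with xdt attributes. The connection string name and value below are made up for the example:

```xml
<?xml version="1.0"?>
<configuration xmlns:xdt="http://schemas.microsoft.com/XML-Document-Transform">
  <connectionStrings>
    <!-- Replace the local connection string with the staging one, matched by name. -->
    <add name="MainDb"
         connectionString="Server=staging-db;Database=Website;Integrated Security=SSPI"
         xdt:Transform="SetAttributes" xdt:Locator="Match(name)" />
  </connectionStrings>
  <system.web>
    <!-- Turn off debug compilation on the server. -->
    <compilation xdt:Transform="RemoveAttributes(debug)" />
  </system.web>
</configuration>
```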
I convinced myself to give it a shot, so we set up a new build configuration in TeamCity. Its settings reflect the command line from above with a few minor changes (I removed the DesktopBuildPackageLocation and set the /v[erbosity] switch to m[inimal]):

```shell
msbuild Website.csproj "/p:Platform=AnyCPU;Configuration=Release" /t:Package /v:m
```

The second step is to use the generated script, ci-stage.deploy.cmd. I recommend running the script by hand once, using the /T switch for a trial run, just to make sure everything looks alright. In our case we found out that the deployment of the package would have deleted a lot of files, most of all images, from our website. This was not what we wanted! After a quick search I found this question on stackoverflow.com: MSDeploy: "Leave extra files on destination" from command line? So I added this switch to the parameters list in the build step configuration.

That's it! This is all we need on the command line to generate a package that is ready for deployment on the staging server. There are a few more necessary settings, including the DesktopBuildPackageLocation, that can be found in the Package settings window inside the project properties of the web application project: the DesktopBuildPackageLocation can be set here instead of on the command line; the website and application name on the destination IIS server that this package should be installed to; and some more options like excluding debug symbols etc. These settings are saved in the project file and will be used during generation and deployment of the package.

That's all I have to say right now. Happy Coding, Oliver
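For reference, the stackoverflow question mentioned above points to MSDeploy's DoNotDeleteRule. Passing it through the generated cmd file looks roughly like this – a sketch only, as the exact parameter list of our build step may differ:

```shell
ci-stage.deploy.cmd /Y "-enableRule:DoNotDeleteRule"
```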

Too Much Constructor Injection!?

by robert 10. June 2010 10:02

Is too much constructor injection an anti-pattern? In a blog post, Jeff Palermo constructs an example in which an injected component has a really slow initialization. Since the component is only needed by one method of the class, he concludes that a lazy factory is a better alternative to constructor injection of the component. The comments reject this conclusion almost unanimously, because:

1) Deferred initialization can also be achieved via the container, and as of .NET 4 via System.Lazy<>.

2) Programming against an interface means precisely that you do not control implementation details.

3) Lengthy initialization inside the component means the component is of poor quality and should be fixed there: no expensive operations should be performed in a constructor implementation.

4) An abstract factory creates a dependency on the factory itself, which increases the number of dependencies and reduces the benefits of IoC. Every "new" is bad "news".

5) To avoid repeated initialization, services could be configured as singletons, although global configuration can be a problem of its own, because it can be hard to debug and hard to follow at the application level.

Thanks to Jeff Palermo for his very readable post!

Links: http://jeffreypalermo.com/blog/constructor-over-injection-anti-pattern/
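Point 1 can be illustrated in a few lines of C# – a sketch of hiding an expensive initialization behind System.Lazy<> while keeping plain constructor injection (the service names are invented for the example):

```csharp
using System;

public interface IReportService { string Render(); }

// Imagine an implementation whose constructor does a really slow warm-up.
public class SlowReportService : IReportService
{
    public SlowReportService() { /* expensive initialization would happen here */ }
    public string Render() { return "report"; }
}

public class TodoController
{
    private readonly Lazy<IReportService> _reports;

    // The dependency is still declared in the constructor,
    // but the expensive service is only created on first use.
    public TodoController(Lazy<IReportService> reports)
    {
        _reports = reports;
    }

    public string ExportReport()
    {
        return _reports.Value.Render(); // SlowReportService is constructed here, not earlier
    }
}

public class Program
{
    public static void Main()
    {
        var controller = new TodoController(new Lazy<IReportService>(() => new SlowReportService()));
        Console.WriteLine(controller.ExportReport()); // prints "report"
    }
}
```

Most IoC containers can provide the Lazy<IReportService> wrapper themselves, so the consuming class stays free of factory code.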

GEN0 and GEN2 Heap Moving in Opposite Directions?

by robert 3. June 2010 19:48

Gen2 and Gen0 are clearly moving in opposite directions – I don't understand why. Does anyone have an explanation? Does the garbage collector start cleaning up Gen2 because a lot of new memory needs to be allocated? Image: memory usage for Camping.Info. Server: 8 GB RAM, 2x quad-core, about 1000 active sessions, requests per second: avg. 50, peak 120. Of course, the fact that there is 10x more memory in Gen2 means that we still haven't tracked down our memory leak. Shouldn't Gen2 normally be smaller than Gen0? Many questions :-) (Robert)

ALT.NET.BERLIN: Second "Coding Dojo" and First "Open Table"

by robert 10. May 2010 20:39

The Berlin user group ALT.NET-BERLIN has held its second coding dojo (and its 14th meeting overall). Mike's write-up of the coding dojo is well worth reading – I got a lot out of both the meeting and the report. Anyone who can should definitely attend the next dojo! Tomorrow, Tuesday, 11 May 2010, our first Open Table takes place; the topic is design patterns, and the agenda can be found here. If you want to stay up to date on ALT.NET.BERLIN events, I recommend the XING group .NET Berlin, which Marco founded today :-)

About Oliver

I build web applications using ASP.NET and have a passion for JavaScript. I enjoy MVC 4 and Orchard CMS, and I do TDD whenever I can. I like clean code. I love to spend time with my wife and our children. My profile on Stack Exchange, a network of free, community-driven Q&A sites

About Anton

I'm a software developer at teamaton. I code in C# and work with MVC, Orchard, SpecFlow, Coypu and NHibernate. I enjoy beach volleyball, board games and Coke.