Serializing A PagedList Using JSON.NET In ASP.NET MVC – Gotcha And Workaround

Recently I discovered that there’s no one standard way for AJAX-driven server-side paging in ASP.NET MVC (in Web API, you can expose an IQueryable). For the case in hand, I decided to use PagedList for the server bit of the game.


The PagedList interface looks a bit like this (for demonstration only, real code is a bit different, check its source code for the real stuff):
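For orientation only (the member names here are recalled from the library, so treat them as approximate and check the source):

```csharp
public interface IPagedList<T> : IEnumerable<T>
{
    int PageCount { get; }
    int TotalItemCount { get; }
    int PageNumber { get; }
    int PageSize { get; }
    bool HasPreviousPage { get; }
    bool HasNextPage { get; }
    bool IsFirstPage { get; }
    bool IsLastPage { get; }

    // It is also enumerable and indexable over the current page's items.
    T this[int index] { get; }
    int Count { get; }
}
```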

It provides nice properties for paging, exposes itself as an enumerable, and has an indexer. Apart from this snippet, the library also provides a ToPagedList() extension method that applies to any enumerable and populates the properties from it (by enumerating it, and by calling the Count() method).

We were also using JSON.NET for JavaScript serialization, which is pretty much the de facto standard nowadays.

The JSON.NET Serialization Problem

JSON.NET has a nice default behavior: if you serialize a class that implements IEnumerable&lt;T&gt;, and you don't have a special treatment rule for it (via what JSON.NET calls "Converter" classes), an instance of that class will be serialized as a JavaScript array, where the enumerated objects are the contents of the array. Makes a lot of sense, right?

Well, yes, except when you have a custom collection like PagedList and you want to treat it as an object that has several properties, not as an array. JSON.NET actually provides a solution for this: all you need to do is apply the [JsonObject] attribute to your class.

Unless you don’t own the source code of this class.

In this case, you need to inherit from it. By doing this, I lose the nice ToPagedList() extension method (because it creates an object of the PagedList class directly), but luckily it does nothing but call new PagedList() with the parameters we give it, so we don't lose much.

Here's what my implementation looks like:
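Roughly like this (a reconstruction; the constructor signatures are assumed to mirror the base PagedList&lt;T&gt; ones):

```csharp
[JsonObject]
public class SerializablePagedList<T> : PagedList<T>
{
    public SerializablePagedList(IQueryable<T> source, int pageNumber, int pageSize)
        : base(source, pageNumber, pageSize)
    {
    }

    public SerializablePagedList(IEnumerable<T> source, int pageNumber, int pageSize)
        : base(source, pageNumber, pageSize)
    {
    }

    // Subset is a field in the base class, so JSON.NET skips it by default;
    // exposing it via a property makes it serialize.
    public IEnumerable<T> Items
    {
        get { return Subset; }
    }
}
```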

Apart from having to copy the constructors to pass parameters to the base ones, have you noticed the extra Items property in there?

That's because the Subset member it includes is actually a field, not a property, and JSON.NET won't serialize that by default. I could have overridden this behavior somewhere else, but since I'm already fiddling with this class here, it made sense to just stick a property on it.

Bonus: A Bit More On Implementation

In my actual code, I added the Dynamic LINQ NuGet package and assumed a single property for sorting (which was fair for the situations where I intend to use this), so I complemented the code above with another class that looks like this:
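Something along these lines (a hypothetical sketch of the idea; the class and member names are illustrative, not the original code): a class carrying the paging/sorting parameters, with a method that applies Dynamic LINQ's string-based OrderBy and fills the page.

```csharp
// Hypothetical sketch; names are illustrative, not the original code.
public class PagedListRequest<T>
{
    public int PageNumber { get; set; }
    public int PageSize { get; set; }
    public string SortProperty { get; set; }   // the single sort property
    public bool SortDescending { get; set; }

    public PagedList<T> CreatePagedListFromQueryable(IQueryable<T> source)
    {
        // System.Linq.Dynamic allows ordering by a property name in a string,
        // e.g. "Name" or "Name descending".
        var ordered = source.OrderBy(
            SortProperty + (SortDescending ? " descending" : ""));

        return new SerializablePagedList<T>(ordered, PageNumber, PageSize);
    }
}
```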

This allows the controller to instantiate an instance of the SerializablePagedList class and pass it all the way to whatever repository method I have.

The repository method will take it as a parameter and work out what IQueryable it needs, but instead of passing that to the UI, it calls CreatePagedListFromQueryable(), which returns an innocent-looking PagedList object (because SerializablePagedList inherits from PagedList). The repository passes that back to the controller, which can serialize it to JavaScript without a problem. The rest is all JavaScript code to work out how to create the paging parameters and how to use the returned paging hints.

Even more, now that I think about it, maybe I should change the return type to SerializablePagedList, to make the Items property visible to the developer (because otherwise they'd think it's magic, and in coding, magic is BAD). I'll leave this as an exercise for you :)

Final Words / Disclaimer

The motivation behind this post is that I found serializing PagedList using JSON.NET a challenge, and I wanted to help others work it out faster than I did. Is this how I'd recommend doing paging? Well, I don't know, but if it's what you choose, I hope I have saved you some time.

And more importantly, is it good enough to be the de facto standard I mentioned I was after at the beginning of the post? Not really. I think it's not bad, but definitely not the best. I'd love to see less clever (read: hacky) and simpler solutions.

My Quest To Find The Best NodeJS Http Client With HTTPS and Basic Auth

This may be a completely useless post, but it took me a while to figure out some of the pieces until I got here, so I thought maybe it's worth sharing!

What’s this & how did I come by it?

Instapaper is a nice service that provides you with a "read it later" bookmarklet. You use this bookmarklet to add any web page to their list, and you can read that page later via their site, a mobile application (from them or a 3rd party), or via Kindle (or other book readers)!

Apart from their bookmarklet, they provide a simple API to add URLs programmatically. This is useful if, for example, you want to extract URLs from other services, say Twitter favourites. They also provide a Full API (OAuth API) for doing everything their site and apps do, except some things in this so-called Full API are (or at least used to be) only for paid subscribers.


Instapaper's simple API is quite interesting though: according to the docs, you can use HTTP and provide only a username (email), not even a password, to add a URL, and you can use POST or GET. That's scary for anyone who thinks about security, but maybe OK for what the API does. They also provide the API over proper HTTPS with basic HTTP authentication (and if you want more, you can always use the Full API's OAuth).

I wanted to use their API over HTTPS using POST and Basic Authentication. In my tests, I was happy to try HTTP and/or GET and/or sending the username in clear text. I also thought I'd write this in Node because I thought it should be easy. Node people often serve REST/HTTP APIs, so they must be good at consuming them too (even though in reality most consumers are client-side JS, not Node), right? Well, maybe yes.

The Code

I could explain the challenges, how I overcame them, and then show the final code, but I have spoken too much already, so let's begin with the code.

var request = require('request'); // the "request" NPM package

var basicAuthenticationParams = {
    username: 'MY_EMAIL',
    password: 'MY_PASSWORD'
};

var requestParams = {
    url: '' // the URL you want Instapaper to save
};

// the first argument is the Instapaper API endpoint URL (elided here)
request.post('', {
    form: requestParams,
    auth: basicAuthenticationParams
}, function (error, response, body) {
    // This is only returned for network errors, not server errors
    if (!!error) {
        console.log('Network Error:');
        throw error; // it's an actual JS Error object, not a string
    }

    var statusCode = !!response ?
        parseInt(response.statusCode, 10) || null
        : null;

    // I could have checked for 200,
    // but technically 200-299 are success codes,
    // and more importantly, Instapaper sends 201 for success
    if (!!statusCode && statusCode >= 200 && statusCode < 300) {
        console.log('Success!');
    }

    console.log(); // Extra line separator
    console.log('HTTP Status:');
    console.log(statusCode);
    console.log(); // Extra line separator

    // In Instapaper's case, by default,
    // they just send the status code in the body too
    console.log('Body:');
    console.log(body);
});


See? It was all very straightforward. Well, it wasn't for me. I tried several NPM packages to get this effect, and none was going well. The "request" package was the one that did the trick. This is generally accepted in Node: the built-in API is very low level, and there is usually some NPM package that gives you a nicer one.

None of the other packages I tried had dependencies on other packages. This "request" package gave me the usual shock of seeing so many NPM packages downloaded and installed, but this is the philosophy of NPM: build small reusable pieces on top of other pieces.

Getting the request headers right (Oh my!)

While the packages were not working, alternating HTTP status codes between 403 (invalid credentials) and 400 (missing parameters), it was very hard to tell what I needed. Even attaching a proxy to get the output into Fiddler didn't go well with, for example, the "easyHttp" package (there is a similarly-named NuGet package for .NET, BTW, but I don't think it's related at all).

I went to my favourite Chrome extension, Postman, which works nicely like Fiddler for the "compose request" part (actually, arguably it's even easier). The request parameters (I was using the username as another request parameter, not Basic Authentication, just to test) didn't seem to work until I changed the default "form-data" to "x-www-form-urlencoded".


This added the following header to the request: "Content-Type: application/x-www-form-urlencoded", which kind of makes sense. This overly permissive API feels optimized for sending from web pages, where forms are a first-class concept.

I tried to add the header when I was using the other packages, but that didn't seem to do the trick. With this one, as you can see, I didn't even need to set it; I just used the "form" property of the options instead of the "body" or "json" properties.

This is why I'm posting this: to hopefully help someone googling get up and running quicker than I did. This is not just about Instapaper; this can be very helpful in so many situations.

For the rest of you (.NET / non NodeJS devs)…

If you are not a Node person but still want to have a fiddle: after you install Node (which includes the NPM package manager by default), save the code to a JS file, "some-file.js", preferably in a new folder.

Assuming Node and NPM are in the PATH (installer default), you can open Cmd or PowerShell console in that folder and run npm install request then node some-file.js (no quotes in both), and see the results in front of you.

My 1st YouTube Video: Angular.JS Directives Demo

So, my wife has been encouraging me to get this started over the past 4 years or so. I was busy with offline groups at first, followed by traveling to work in different countries, and then later all sorts of new stuff I wanted to learn around .NET, software craft in general, and web development and trends. At last, I started, and uploaded my first YouTube video.

The Video

To make sure I actually got to the point where I have something to upload, I used the Minimum Viable Product approach. I looked for a simple topic that I still think you guys might be interested in, and I spent some time editing the video, but didn't wait until I was 100% happy with the quality (particularly of my own voice) before publishing. The idea is that if I continue to do this, I might be able to get more videos out soon, and practice will naturally increase the quality. Was that the right way to do it? Only your feedback can tell!

Simple Angular.JS Demo: Using HTML Directives Even Without JavaScript Controllers

Now It’s Up To You!

As you may expect, I'm really looking forward to all sorts of feedback on this video. Let me know whether you liked it, and what you'd like to see in future videos. If there is any question, problem, or idea that you'd be interested in seeing me talk about, please let me know.

Thanks a lot.

Easy Ways To Discover Feed URL of A Website / Page

All blogs and news websites provide some sort of aggregation feed, usually RSS or Atom. This allows users to add the feed URL to their favourite aggregator and stay updated with future content as it comes. This post shows how to get a URL to subscribe to, and how to get multiple URLs if the site provides multiple formats.

A .NET library

One easy way to do this in .NET is using the Argotic Syndication Framework library, the code will look like:

Here is what you get when you run the code against


A few notes on this approach:

  • You must have noticed the `Where` check in the code; the library seems to capture any `related` link in the HTML, not just syndication links. That's why we needed to filter them explicitly
  • Quite often, when you have a main site that has different branches, you get more than one feed link. For CNN, for example, you get different feeds for certain site languages; for the site I ran the code against, I got one for the site itself and another for members of the service. Arguably, this is not always what you want when you add the site to a reader kind of application
  • As expected, this code is quite slow in debug mode; it takes about 1.5 seconds to run alone! It does better in release mode (build configuration set to "Release", and web.config set accordingly)


Google API

In its simplest form, syndication discovery is a matter of finding a `link` tag with a proper `rel` attribute (typically set to `alternate`) and a `type` attribute holding the feed MIME type. However, in real life, at least historically, there used to be many variations in the way discovery was implemented (read the next section for more).

One of the tools that managed to get the right URLs for different edge cases was Google Reader. Apart from Google Reader itself, whose closing was part of the reason I wrote this post, Google allows you to use their systems to get the right syndication feed URL of a given page by simply calling a public JSONP API.

This is the structure of the `data` parameter returned by the previous call:


To learn more about this specific API, check:

To learn what’s special about JSONP requests and why jQuery needs to treat it differently, read the $.getJSON() documentation:


Background Of The Problem

Even though social media has made people depend on links shared on social media sites (by their peers, or by the creators of the feed), adding a syndication feed to a website is a trend that has continued to grow in many product and subscription websites, especially since it's easy to automate social media posts from the feeds afterwards.


As Google Reader will be retired in July, I thought for a minute about what it'd take to put a web-based reader together. This was before I learned about the existing awesome alternatives like Feedly and so many others.

Then I remembered an application I was working on in 2007. One feature we needed, and a colleague worked on, was getting RSS posts from the personal blogs of the site members. I remember seeing him doing all sorts of crazy Regular Expression matches for so many formats to get the URL. It turns out that, at least at the time, different blog providers used different ways to advertise the feed URL in the blog homepage markup. There were so many cases that it took my colleague several days to cover a large set of test cases from the different providers we knew our users were using.

I wanted to see whether this problem was still a thing in 2013, and tried to see what options we had; hence came this post, you know, just for fun :).

Hope some of you were interested in this too!

Mapping José Romaniello’s Review Domain Using FluentNHibernate

This article is the third in a series of articles not written by me, but by José F. Romaniello. He is a big NHibernate guy, so he created a sample domain trying to evaluate how close the latest Entity Framework 4.1 Code First stuff is getting to NHibernate features.

Later, he chose to show how to do the same Code-First mapping using NHibernate and ConfORM, an NHibernate mapping library created by Fabio Maulo, a primary developer of NHibernate itself.


I asked him to do it also with FluentNHibernate, so he took the time and effort to create a nicely put-together Visual Studio solution. At some point he handed it to me, as I was familiar with FluentNHibernate in general, though not with automapping, which we wanted to use for this sample. So, I am now posting about this experiment.

This article is much later than it should have been. Apologies to those who have been waiting.

Convention Based Mapping, AKA, Automapping

My audience is slightly different than Jose's, so I might need to explain this one. Skip it if not needed.

When you do Code-First (or Domain-First with POCOs, or your favourite name), normally you need to define the mapping between your classes and DB tables class by class. For each class, you define which table(s) it maps to, which properties correspond to which columns, etc.

Your mapping library, be it Entity Framework 4.1 Code-First, NHibernate XML/HBM mappings, ConfORM, FluentNHibernate, etc., can give you some help with that by having some defaults for mappings. For example: assuming the table name is the same as the class name, that properties map to columns with the same name, that your class has a single-column PK called "Id" or "&lt;ClassName&gt;Id", etc.

Of course, they might also allow you to modify those defaults to your liking, and to have exceptions to those defaults (conventions) for specific classes/columns. The purpose is still to need as few such customizations as possible.

This is what was first introduced in FluentNHibernate as Automapping, and it is also becoming known as convention-based mapping as other libraries adopt it.

The Domain

Let me borrow the class diagram Jose had for his test/review domain:


The Libraries

All I needed to do was run a NuGet command that brings FluentNHibernate and all its dependencies:
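Assuming the standard package id, that is the usual Package Manager Console command:

```shell
Install-Package FluentNHibernate
```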


Remember, you don't need to do it from the PowerShell console; right-clicking the project, choosing "Add Library Package Reference", and finding the FluentNHibernate package in the wizard (select "Online" on the left and type FluentNHibernate in the top-right text box) will do just as well.

I did this just to show the dependencies more easily in the picture. (Note also that I already had this in my project, so I created a new project, "CleanFNhSample", just to show you this.)

There is another dependency that I needed, not just because Jose used (and created) it, but also because it's really convenient. NHibernate by default uses custom types for collections of type "set", because those were not supported before .NET 4.0. Jose has a code-file NuGet package that makes it possible to use the built-in .NET 4.0 HashSet type instead of the custom type. It's called "NHibernate.SetForNet4":


Like the other packages, I could have added this from the console; I'm just showing you both ways.

The Application Code

I'll be showing snippets of the code here, and later giving you a GitHub URL so you can play with it as you wish.

The main test code Jose created, which uses the domain classes in the diagram above, is pretty simple. It creates a new NHibernate (NH) configuration (including class-to-database mappings), uses the mappings to create the schema in the database (replacing anything that might exist; yes, and this is not the only option), creates an NH session (roughly equivalent to an EF DataContext, if I over-simplify), creates some records, and saves them to the DB.

private static void Main()
{
    Configuration configuration = ConfigureNHibernate();

    Console.WriteLine("Generating the schema");
    new SchemaExport(configuration).Create(true, true);

    Console.WriteLine("Persisting some objects");
    using (ISessionFactory sf = configuration.BuildSessionFactory())
    using (ISession s = sf.OpenSession())
    using (ITransaction tx = s.BeginTransaction())
    {
        var product = new Product
                          {
                              Name = "Fideos",
                              Customizations =
                                  {
                                      new Customization
                                          {
                                              Name = "Tuco",
                                              PossibleValues = {"Pocon", "Medio", "Sopa"}
                                          }
                                  }
                          };
        s.Save(product);
        tx.Commit();
    }
}
Now, before we get into the "ConfigureNHibernate()" method, which is the most important method here, let's look at some other classes…

Project Structure

This is how my project looks right now:


The only difference from Jose's ConfORM review, and even his original FluentNHibernate work, is the "Mapping" folder. While the ConfORM samples used to have different methods in the same file to do the mappings, FluentNHibernate people seem to prefer (or I think they do) having a different class per part of the mapping. Of course, you can still have those classes in the same file and no one would even complain!


So, there are 3 FluentNHibernate classes in there. We'll see why those are needed after we look at the main configuration. Note that not all of this configuration is required; we were trying to meet the specific requirements that were met in the ConfORM demo.

Main Configuration

Each part exists for a specific purpose that is mentioned in the comments above it.

private static Configuration ConfigureNHibernate()
{
    FluentConfiguration fluentConfig = Fluently.Configure()
        .Mappings(
            m => m.AutoMappings // Use mapping by convention.
                     // Map all types in current assembly
                     //     using default options from StoreConfiguration class.
                     .Add(AutoMap.AssemblyOf<Program>(new StoreConfiguration())
                              // Add our specific defaults for Cascade, etc...
                              // The line itself adds any conventions in assembly,
                              //    which is not the only option we have.
                              .Conventions.AddFromAssemblyOf<CollectionConvention>()
                              // Add 'Order' class exceptions to the convention.
                              .UseOverridesFromAssembly(typeof (OrderOverride).Assembly)))
                     // (The original code also displayed the mappings XML
                     //  in the console here. Typically you should do little
                     //  XML stuff to output that; it can also output to a
                     //  text file by specifying a path.)
        .Database(() =>
                  MsSqlConfiguration.MsSql2008 // Use SQL Server 2008 Dialect
                      .ConnectionString( // Use connection string from app/web.config
                            c => c.FromConnectionStringWithKey("NHibernateTest"))
                      .ShowSql()   // Display SQL generated by NH in console.
                      .FormatSql() // Format / Tabify SQL in console for readability.
                      // Auto escape names of tables for safe naming.
                      //    In SQL Server, this uses '[', and ']' not quotes.
                      // It's worth mentioning: 'Raw' allows us to add config
                      //     values as key/value strings like XML mappings,
                      //    and Environment / Hbm2ddlKeyWords are NH classes not FNH.
                      .Raw(Environment.Hbm2ddlKeyWords, "auto-quote"))
        // Use Jose's collection factory.
        // This enables us to use .NET 4.0 HashSet not custom Set types.
        .ExposeConfiguration(
            c => c.SetProperty(Environment.CollectionTypeFactoryClass,
                               typeof (Net4CollectionTypeFactory).AssemblyQualifiedName));

    // All this was done using the Fluent NHibernate object to build config.
    // So, now build the actual NHibernate Configuration object.
    return fluentConfig.BuildConfiguration();
}

I hope the comments are enough. We need to set some mapping rules in StoreConfiguration class, apply some non-default standards for our collections in CollectionConvention class, and some exceptions for a field of Order entity in the OrderOverride class.

I know you might be worried about the big method, and the fluent API that makes the whole thing one statement. From my experience, this can make debugging the part of the statement that fails a problem, but that's not really the case here, because NH configuration/mapping problems only show up when you build the configuration or create the session factory anyway; it doesn't fail on the individual steps. And don't worry about that either: usually (but not always), when you navigate the InnerException (sometimes recursively through multiple inner exceptions), you get specific information about which part was wrong, and maybe what was wrong about it too.


FluentNHibernate Classes

The way you write conventions, overrides, and other kinds of FluentNHibernate customization is by creating classes that implement certain interfaces or inherit from certain base classes. There are many interfaces provided, but you only need to implement the ones that let you supply defaults other than FluentNHibernate's own. Also, a lot of what these classes do could have been added to the configuration method, but you don't want it to grow even bigger, at least not for this explanation sample!


StoreConfiguration Class

public class StoreConfiguration : DefaultAutomappingConfiguration
{
    public override bool IsConcreteBaseType(Type type)
    {
        return type == typeof (EntityBase);
    }

    public override bool ShouldMap(Type type)
    {
        return type.Namespace == typeof (EntityBase).Namespace
               && type.IsEnum == false
               && type != typeof (EntityBase);
    }
}

In this class we do not implement an interface; instead, we derive from the default automapping configuration class, which defines all the automapping defaults. This one was originally created by Jose, and I'm not sure myself whether the typical use is to have such a class or to do it directly in the configuration.

What this class does is specify which classes represent DB entities and should be mapped automatically. It tells FNH to map all types in the EntityBase namespace, except enums and except EntityBase itself.


If I understand correctly, the only part we should have needed is the namespace check in "ShouldMap", plus "IsConcreteBaseType" (which basically tells FNH not to create a separate table for EntityBase that all other tables reference for their PK; instead, it treats EntityBase as part of each child class, as if the "Id" property were implemented in each of the classes rather than in their base class). I believe we shouldn't have needed to exclude the enum explicitly (if it's included, the configuration is invalid, rather than some milder effect like getting a table for the enum values).


CollectionConvention Class

public class CollectionConvention : ICollectionConvention
{
    #region ICollectionConvention Members

    public void Apply(ICollectionInstance instance)
    {
        // The original body was elided from this excerpt. From the
        // description below: cascade all operations (the "set" handling
        // is also configured here in the original).
        instance.Cascade.All();
    }

    #endregion
}

As mentioned, conventions are applied by implementing I…Convention interfaces, and there are many of them to override whatever defaults you want. In this example, if we had wanted our collection convention to apply to only a subset of collections, we could have also implemented "ICollectionConventionAcceptance". You can implement as many I&lt;SomeCriteria&gt;Convention and I&lt;SomeCriteria&gt;ConventionAcceptance interfaces in the same class as you like.


Here we tell NHibernate to apply cascading to all operations (you can restrict it to updates or deletes, or none; check the NH docs). We also tell FNH that all collections we use will be of type "set". This is different from what we had in the configuration: the configuration part tells NHibernate what actual type to use when we want our collections to be sets, which is why it's not per-class or per-collection (note that conventions are like mappings for each collection, but done all at once). The convention here makes NHibernate treat each collection as a "set" and use the collection factory we provided in the configuration for dealing with sets.


Also note that I usually had to add extra code to check whether the collection is a collection of string, int, or a similar non-entity type, and call instance.Name("Value"), or else it wouldn't work. Just while writing this article, I tried it and found that I didn't need to add it anymore (it works without it).


OrderOverride Class

public class OrderOverride : IAutoMappingOverride<Order>
{
    #region IAutoMappingOverride<Order> Members

    public void Override(AutoMapping<Order> mapping)
    {
        mapping.Map(o => o.Total).Access.ReadOnly();
    }

    #endregion
}

If you have ever worked with a normal ClassMap&lt;TEntity&gt; in FluentNHibernate, this one will be familiar. The only differences are having an interface instead of a base class, using an "Override" method instead of the constructor, and having to call the "Map" methods on the mapping parameter (or whatever you call it) instead of directly.

If you haven't, then simply put, these override classes are places to add mappings for the specific classes and properties that have special cases, without creating conventions that apply to many other classes at the same time. Here we had a read-only property in the Order class, and we wanted to set its access strategy (property, field, none, …) to read-only. Jose and I thought it would make a good example of overrides. This is also the place for things like a special SQL formula for one specific property.


NuGet And Resources

Now, as promised, you can have the code on GitHub so that you can clone, fork, or simply download it and have a look.

The Code URL is:

I'd highly recommend you not only browse the code online, but also use TortoiseGit or whatever client you feel friendly with (including the console) to bring the code locally and start playing with it.

However, if you insist on just old-school download-the-code style, no problem, here is the direct download link:

Sorry for the length of the post; I hope it wasn't a show-stopper that made you just skip to the end.


For context, here are some links that were mentioned during the post:

The Easiest Way To Write Async Code – Reading #FunnelWeblog Code

This one is a Back-To-Basics style post. Last month, I was checking some code for the relatively new .NET open source blog engine, FunnelWeb, and noticed this bit of code:


            ThreadPool.QueueUserWorkItem(state => SendEmail(settings, commentDetails));

This is just a really easy way to make an async call, right?


BTW, you can learn more about the ThreadPool.QueueUserWorkItem() method from MSDN here.


More Interesting Stuff

Playing with it after reading, I found that Matt Valerio has some very interesting takes on this method, showing how to use it in many elegant ways:


I highly recommend the first two articles especially, the code is really elegant.


What IS This Post???

I felt a strong desire to blog something before I go to work today and wanted to see if that’s possible.

I'm not sure if I should write more posts like this, or whether I'd then be converting this blog into a micro-blog/Twitter/Tumblr of some kind.

You can help me by telling me what you think in the comments.

Twitter OAuth, Persistent OAuth, TweetSharp: Presentation & Code Nuggets

This is a PowerPoint Presentation (and extraction of the contents) I made as per a couple of friends’ request (@EmadAshi and @AmrEldib) to show how OAuth works along with Twitter and how easy it is to cache OAuth credentials.

As I was doing related work for TweetToEmail, I felt a PowerPoint presentation would be even better than a blog post for this one, but here you get both.

The Presentation

The Contents

Application Registration

  • A Twitter user creates a Twitter Application
    • If the application is web based, it needs to provide a URL. “Localhost” is not accepted as a domain for this URL
  • A Twitter Application gets two pieces of information
    • Consumer Key
    • Consumer Secret
  • A Twitter Application will use these in all coming requests.

Initializing The Process

  • User comes to the application and it decides to authenticate against Twitter
  • Application makes a request using Consumer Key and Secret to obtain “Oauth Request Token”, which consists of two parts
    • Token
    • Token Secret
  • The application builds the authentication URL, including the “Oauth Request Token” parameter and optionally a “Call-back URL” (if different from the default URL provided in the first step)

User Authentication

  • The user is redirected to Twitter; the URL contains the “Oauth Request Token” to identify the application authentication session
  • Assuming the Twitter User being logged in and authorizes the Application
    • If the application is a desktop application, Twitter gives the user a numeric “Verifier” to manually enter back into the application
    • If the application is a web application, the user is redirected back to the application call-back URL with a complex “Verifier” parameter in the URL

Obtaining the Access Token

  • The Application makes a request to Twitter including the “Oauth Request Token” and the “Verifier”
  • It obtains an “Access Token”, likewise it consists of two-parts:
    • Token
    • Token Secret
  • The application needs to send the Consumer Key and Secret and Access Token in every future request that needs the Twitter User privileges

Caching Credentials

  • The application needs at least one authorization process as before
  • The Access Token returned can be saved in session/DB/whatever and then re-used later
  • The application can later use the Access Token directly along with the Consumer Key / Secret to communicate with Twitter without going through any of the previous steps

Sample Code (TweetSharp v 2.0)

Request Token & Redirect
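A rough sketch of this step using TweetSharp 2.0's fluent API, from memory; treat the exact member names (CreateRequest, GetRequestToken, AsToken, GetAuthenticationUrl) as assumptions and verify against the TweetSharp docs:

```csharp
// Sketch (method names from memory, not from the original slides):
var response = FluentTwitter.CreateRequest()
    .Authentication.GetRequestToken(consumerKey, consumerSecret)
    .Request();

OAuthToken requestToken = response.AsToken();

// Redirect the user here to authorize the application;
// an overload of GetAuthenticationUrl() accepts a call-back URL.
string authUrl = FluentTwitter.CreateRequest()
    .Authentication.GetAuthenticationUrl(requestToken.Token);
```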


Getting Access Token


Hints for Web Applications

  • The method GetAuthenticationUrl() has an overload that accepts a call-back URL for the user to be redirected to after obtaining the verifier
  • The important part of the RequestToken is the Token part, not the Secret.
  • All parts of AccessToken are important and required
  • When the user is redirected back from Twitter to your application, you get the following QueryString parameters sent to you
    • oauth_token: The Token part of the Request Token
    • oauth_verifier: The verifier required to obtain the Access Token later

Using Cached Access Token
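With the token cached, later requests can skip the whole hand-shake; roughly (a sketch — `AuthenticateWith` is the TweetSharp 2.0 call I recall for this, verify against your version; the saved values are assumed loaded from wherever you stored them):

```csharp
var service = new TwitterService("yourConsumerKey", "yourConsumerSecret");

// Re-use the saved Access Token parts; no redirect to Twitter needed
service.AuthenticateWith(savedAccessToken, savedAccessTokenSecret);

// All calls now run under the Twitter user's privileges, e.g.:
service.SendTweet("Hello from a cached Access Token!");
```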


Related Links

101 Free Tech Books Update – I Won A WCF 4 Print Book!


Five days ago I got a great email from 101 Free Tech Books. Seems the drawing I wrote about is real!!

Yes, I won a FREE print book. Filled in my shipping information yesterday and got the post that confirms my order was being processed. Feels so real! I’m even asked to give testimonials after receiving the book, which I will…

Which Book?


The book I have chosen is “Professional WCF 4: Windows Communication Foundation with .NET 4”. Sounded like a great title!

There is a trick in here. The option to choose from books is only available from your wish list as it was prior to the random drawing. I didn’t pay enough attention to this earlier, so I had a very small wish list of just a few “sample” books, some of which I already had as ebooks. Not the smartest move.

I also tried to choose another book, add it to my wish list, and go back to choosing again, but, as mentioned, only the books added prior to winning were there. Makes sense though!

You Can Win Too!

Now I can recommend these people even more! They go out of their way to show how real this is, and so far I believe them.

You can use my referral registration URL below and start referring people using your own URL, so that maybe we both win more books!

There are 101 books to win EVERY MONTH.

Just remember, add as much as you can to your wish list now, and let the decision come later ;)

Good Luck :)

Just Noticed GitHub DOES Support SubVersion/SVN [Not only Git]

The SVN News

Today I was hanging around GitHub when I came across some relatively old news, dated April 1, 2010, saying they do support SVN.

Announcing SVN Support

Yes, that’s April Fools’ Day: a funny date to announce anything serious, as they admit themselves in an update to the news post, but it DOES work.

Use the same Git clone HTTP URL, just add “svn.” between “http://” and “github.com”, i.e.: http://svn.github.com/[user]/[repository].git

It even allows you to write changes back to the repository, as announced in the more recent news post, dated May 4, 2010; check it out for the caveats (known issues):

Subversion Write Support

That uses the same URL but with HTTPS: https://svn.github.com/[user]/[repository]

This works best when you want read-only access to some project, or will make only a few commits of your own, and the project has a very long history you are not really interested in. Of course you wouldn’t want to use it if you are leading (or a main committer to) a project hosted at GitHub.
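In shell terms, the URL mapping described above is simple string surgery (the user/repository names here are placeholders):

```shell
user="someuser"
repo="somerepo"

# Read-only SVN mirror of a GitHub repository, per the announcement
svn_url="http://svn.github.com/${user}/${repo}.git"
echo "$svn_url"   # http://svn.github.com/someuser/somerepo.git

# A plain SVN client can then check it out:
# svn checkout "$svn_url" "$repo"
```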

Background, Me and Git (Safe To Skip)

I have been playing with Distributed Version Control Systems (D-VCS) lately, not because it’s fun (it is), but mainly because many open source libraries I’m a fan of have converted from SVN to Git, most of them hosted at GitHub.

Although I feel geeky when dealing with Git (a nice feeling), dealing with its tooling was a bit unpleasant (I’m not against the console, but given that other VCSes have alternatives with good GUIs, it felt bad), and having to fetch the entire version history, not just the latest updates, for projects with a very long history was quite slow and bandwidth-hungry.

I discovered the “--depth” switch of the git clone command, which allows getting only the latest updates, by poking around the TortoiseGit UI, but it still doesn’t play nicely with pulling more recent changes afterwards. The best way is to drop what you have and re-clone the entire working copy. This is bad not only for speed, but also because it wipes out some changes I usually need to make (and keep; the ones I mention at the end of the post).
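The shallow-clone behavior is easy to see on a throwaway local repository (a sketch assuming git is on the PATH; the file:// URL is needed because plain local-path clones ignore --depth):

```shell
tmp=$(mktemp -d)

# Build a tiny repository with two commits
git init -q "$tmp/origin"
git -C "$tmp/origin" -c user.email=a@example.com -c user.name=a \
    commit -q --allow-empty -m "first"
git -C "$tmp/origin" -c user.email=a@example.com -c user.name=a \
    commit -q --allow-empty -m "second"

# --depth 1 fetches only the most recent commit
git clone -q --depth 1 "file://$tmp/origin" "$tmp/shallow"

# Only one commit is visible in the shallow copy
git -C "$tmp/shallow" rev-list --count HEAD   # prints 1
```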

I also tried TortoiseHg’s git and SVN modules, thinking it’s better tooling, but I found that it doesn’t play nicely with them either (the details aren’t at the top of my head right now).

What I’m Using It For

So, this news is very nice for me. It allows me to do exactly what I needed: get the latest version, make local changes I’m not interested in merging into the main project (like using a key file with some VS project, or upgrading an entire VS solution from VS2008 to VS2010, etc.), update later when remote changes happen, and get those merged with my stuff.

The clear example of this is NHibernate (hosted on SVN, though I hear on the uNHAddin mailing list there are early plans to convert to Mercurial). It allows me to do exactly that, and I was willing to do the same with Fluent NHibernate (hosted at GitHub). Now I can, YAY :)

Using Different VCS Clients?

To make it clear, this is not about getting rid of Git completely; I think I need to get more involved in it. I used to like Mercurial more, but it seems with time I’m getting into this Git thing.

It’s nice there are many VCSes to choose from in general, but not nice that you cannot choose only one for yourself. Seriously hate the fact that I have to use different VCS clients at the same time, especially when that depends on what VCS the project uses, not what I happen to prefer.

Now CodePlex, Google Code, and GitHub support SVN. That’s good for me. I’ll try to stay there as long as possible. TortoiseHg supports Git/SVN via plugins too, but I won’t stick to that, as the tooling isn’t as great as I need and the extensions don’t always work right for different tasks.

Videos from NDC 2009: SOLID Principles, Legacy Code, WCF, Software Design,…

Here’s another email from the internal mailing list of Injazat .NET Ninjas (“Ninjazat”, as we call ourselves) that I’m sharing with blog readers as well.

Just a placeholder, until I move one of the 18 (just discovered the number now – terrifying!) drafts in my Windows Live Writer into a published post, or delete them all!

Subject: [Learning] Some very interesting videos

Some videos from NDC 2009 event (Norwegian Developers Conference 2009) – about software design and related issues:

· NDC Video – Robert Martin – S.O.L.I.D Principles of OO class design

· NDC Video – Robert Martin – Craftsmanship and Ethics

· NDC Video – Robert Martin – Component Principles

· NDC Video – Robert Martin – Clean Code III – Functions

· NDC Video – Michael Feathers – Working Effectively with Legacy Code

· NDC Video – Jeremy D. Miller – Convention Over Configuration

· NDC Video – Michael Feathers – Seven Blind Alleys in Software Design

· NDC Video – Ted Neward – WCF Patterns

· NDC Video: Michael Feathers – Design Sense

For the complete list of videos from this event check videos from:

· Day 1

· Day 2

· Day 3

My favorite topics are (recommendations):

· NDC Video – Robert Martin – S.O.L.I.D Principles of OO class design

· NDC Video – Michael Feathers – Working Effectively with Legacy Code

Of course the other topics are interesting as well.