
Inexpensive Windows Phone 8 Development Without the Emulator

For some time I’ve wanted to try my hand at Windows Phone 8 development, but I didn’t own a Windows Phone and had no interest in switching to one for daily use.  I’m unable to use the emulator that is part of the Windows Phone SDK because I don’t have Windows 8 Pro, and even if I did, I’d also like to do some game development with Unity, which doesn’t support deploying to the emulator at this time.  In any case, it’s always a good idea to test your apps on real hardware.

Fortunately, there is a solution that resolves all of these problems for $99, about what it would cost to upgrade Windows: no-contract phones.  The Nokia Lumia 520 and 521 are a cheap way to get your hands on real hardware, and you don’t have to pay a penny for service.  The 520 is an AT&T phone and the 521 is the T-Mobile version, but that doesn’t really matter if you’re only going to use Wi-Fi.  You can also get the completely unlocked Windows Phone 8X by HTC, but you’ll have to pay 500 bucks to do it.

When you get your phone, don’t install the SIM card.  During setup, carefully avoid or cancel any questions that involve activating service.  You should have no problem getting on Wi-Fi, and the phone’s Start screen will simply show “No SIM” in the Phone tile.

Installing the SDK and registering your phone for development is no different than if you had service on your phone.  When deploying your app from Visual Studio, just make sure it is configured to deploy to “Device” and not an emulator.

Micro-ORMs for .NET Compared – Part 3

This is the final part of a 3-part series comparing micro-ORMs.  We’ve already seen Dapper and Massive.  Now it’s time for PetaPoco.

PetaPoco

Website: http://www.toptensoftware.com/petapoco/
Code: https://github.com/toptensoftware/petapoco
NuGet: http://nuget.org/packages/PetaPoco

Databases supported: SQL Server, SQL Server CE, Oracle, PostgreSQL, MySQL
Size: 2330 lines of code

Description

PetaPoco was, as the website states, “inspired by Rob Conery’s Massive project but for use with non-dynamic POCO objects.”  A couple of its more notable features are T4 templates that automatically generate POCO classes and a low-friction SQL builder class.

Installation

There are two packages available to install: Core Only and Core + T4 Templates.  I chose the one with templates, which raises a dialog with the following message:

“Running this text template can potentially harm your computer.  Do not run it if you obtained it from an untrusted source.”

PetaPoco has a click-to-accept Apache License.  If your project is a console application, you’ll need to add an App.config file.
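
Here is a minimal App.config of the sort the examples below assume.  The server and database names are placeholders for a local SQL Server instance, so adjust them to match your setup; note the providerName attribute, which the T4 code generator requires:

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <connectionStrings>
    <add name="northwind"
         connectionString="Data Source=.\SQLEXPRESS;Initial Catalog=Northwind;Integrated Security=SSPI;"
         providerName="System.Data.SqlClient" />
  </connectionStrings>
</configuration>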

Usage

Because PetaPoco uses POCOs, it looks more like Dapper than Massive at first glance:

class Product
{
    public int ProductId { get; set; }
    public string ProductName { get; set; }
}

class Program
{
    private static void Main(string[] args)
    {
        var db = new Database("northwind");
        var products = db.Query<Product>("SELECT * FROM Products");
    }
}
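
Parameterized queries and inserts follow the same low-ceremony pattern.  This is a sketch from memory rather than a tour of the full API, so double-check the signatures against the PetaPoco documentation (parameters are numbered @0, @1, and so on):

var product = db.SingleOrDefault<Product>(
    "SELECT * FROM Products WHERE ProductId = @0", 17);

db.Insert("Products", "ProductId", new Product { ProductName = "New product" });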

There is also experimental support for “dynamic” queries if you need them:

var products = db.Query<dynamic>("SELECT * FROM Products");

PetaPoco has a lot of cool features, including paged fetches (a wheel I’ve reinvented far too many times):

var pagedResult = db.Page<Product>(sql: "SELECT * FROM Products",
    page: 2, itemsPerPage: 20);

foreach (var product in pagedResult.Items)
{
    Console.WriteLine("{0} - {1}", product.ProductId,
        product.ProductName);
}
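
The returned page object also carries paging metadata, so you don’t have to count rows yourself.  The property names below are from memory and worth double-checking against the PetaPoco docs:

Console.WriteLine("Page {0} of {1} ({2} items total)",
    pagedResult.CurrentPage, pagedResult.TotalPages, pagedResult.TotalItems);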

While POCOs give you the benefit of static typing, and System.Dynamic frees you from the burden of defining all your objects by hand, templates attempt to give you the best of both worlds.

The first thing you have to do to use the templates is ensure that your connection string has a provider name.  Otherwise the code generator will fail.  Then you must configure the Database.tt file.  I changed the following lines:

ConnectionStringName = "northwind";  // Uses last connection string in config if not specified
Namespace = "Northwind";

When you save it, you might get the same kind of security warning as before, because Visual Studio is about to run the template and generate code.  You can safely dismiss it.

Now you can use the generated POCOs in your code:

var products = Northwind.Product.Query("SELECT * FROM Products");

First Impressions

PetaPoco is surprisingly full-featured for a micro-ORM while maintaining a light feel and small code size.  There is too much to show in a single blog post, so you should check out the PetaPoco website for a full description of what this tool is capable of.

Final Comparison

All of these micro-ORMs fill a similar need, which is to replace a full-featured ORM with something smaller, simpler, and potentially faster.  That said, each one has its own strengths and weaknesses.  Here are my recommendations based on my own limited testing.

You should consider… If you’re looking for…
Dapper Performance, proven stability
Massive Tiny size, flexibility
PetaPoco POCOs without the pain, more features

Micro-ORMs for .NET Compared – Part 2

This is Part 2 of a 3-part series.  Last time we took a look at Dapper.  This time we’ll see what Massive has to offer.

Massive

Website: http://blog.wekeroad.com/helpy-stuff/and-i-shall-call-it-massive
Code: https://github.com/robconery/massive
NuGet: http://www.nuget.org/packages/Massive

Databases supported: SQL Server, Oracle, PostgreSQL, SQLite
Size: 673 lines of code

Description

Massive was created by Rob Conery.  It relies heavily on the dynamic features of C# 4 and makes extensive use of the ExpandoObject.  It has no dependencies besides what’s in the GAC.

Installation

Unlike Dapper and PetaPoco, Massive does not show up in a normal NuGet search.  You’ll have to go to the Package Manager Console and type “Install-Package Massive -Version 1.1” to install it.  If your solution has multiple projects, make sure you select the correct default project first.

If your project is a console application, you’ll need to add a reference to System.Configuration.

Usage

Despite its name, Massive is tiny.  Weighing in at under 700 lines of code, it is the smallest micro-ORM I tested.  Because it uses dynamics and creates a connection itself, you can get up and running with very little code indeed:

class Products : DynamicModel
{
    public Products() : base("northwind", primaryKeyField: "ProductID") { }
}

class Program
{
    private static void Main(string[] args)
    {
        var tbl = new Products();
        var products = tbl.All();
    }
}

It’s great not having to worry about setting up POCO properties by hand, and depending on your application, this could save you some work when your database schema changes.

However, the fact that this tool relies on System.Dynamic is also its biggest weakness.  You can’t use Visual Studio’s IntelliSense to discover properties on returned results, and if you mistype the name of a property, you won’t know it until runtime.  Like most things in life, there are tradeoffs.  If you’re terrified of “scary hippy code”, then this could be a problem.
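
To make that tradeoff concrete, here is a small sketch using the Products class from the earlier example: any property name compiles against the dynamic results, and a typo only surfaces as an exception at runtime.

var tbl = new Products();
foreach (var product in tbl.All())
{
    Console.WriteLine(product.ProductName);    // fine
    // Console.WriteLine(product.ProductNmae); // compiles, but throws a RuntimeBinderException at runtime
}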

First Impressions

Massive is very compact and extremely flexible as a result of the design choice to use dynamics.  If you’re willing to code without the IntelliSense safety net and can live without static typing, it’s a great way to keep your data mapping simple.

Continue to Part 3…

Micro-ORMs for .NET Compared – Part 1

Recently I became aware of a class of lightweight alternatives to full-blown ORMs like NHibernate and Entity Framework.  They’re called micro-ORMs, and I decided to test-drive a few of the more popular ones to see how they compare.

Each of the tools listed here is small and contained within a single file (hence the “micro” part of the name).  If you’re adventurous, it’s worth having a look at the code, since they use some interesting and powerful techniques to implement their mapping, such as Reflection.Emit, C# 4 dynamic features, and T4 templates.

The Software

Dapper

Website: http://code.google.com/p/dapper-dot-net/
GitHub: https://github.com/SamSaffron/dapper-dot-net
NuGet: http://nuget.org/packages/Dapper

Databases supported: Any database with an ADO.NET provider
Size: 2345 lines of code

Description

Dapper was written by Sam Saffron and Marc Gravell and is used by the popular programmer site Stack Overflow.  It’s designed with an emphasis on performance, and even uses Reflection.Emit to generate code on-the-fly internally.  The Dapper website has metrics to show its performance relative to other ORMs.

Among Dapper’s features are list support, buffered and unbuffered readers, multi mapping, and multiple result sets.

Installation

In Visual Studio, use Manage NuGet Packages, search for “Dapper”, and click Install.  Couldn’t be easier.

Usage

Here we select all rows from a Products table and return a collection of Product objects:

class Product
{
    public int ProductId { get; set; }
    public string ProductName { get; set; }
}

class Program
{
    private static void Main(string[] args)
    {
        using (var conn = new SqlConnection(
            "Data Source=.\\SQLEXPRESS;Initial Catalog=Northwind;Integrated Security=SSPI;"))
        {
            conn.Open();
            var products = conn.Query<Product>("SELECT * FROM Products");
        }
    }
}

As you can see from the example, Dapper expects an open connection, so you have to set that up yourself.  It’s also picky about data types when mapping to a strongly typed list.  For example, if you try to map a 16-bit database column to a 32-bit int property you’ll get a column parsing error.  Mapping is case-insensitive, and you can map to objects that have missing or extra properties compared with the columns you are mapping from.

Dapper can output a collection of dynamic objects if you use Query() instead of Query<T>():

    var shippers = conn.Query("SELECT * FROM Shippers");

This saves you the tedium of defining objects just for mapping.

Dapper supports parameterized queries where the parameters are passed in as anonymous classes:

    var customers = conn.Query(
        "SELECT * FROM Customers WHERE Country = @Country AND ContactTitle = @ContactTitle",
        new { Country = "Canada", ContactTitle = "Marketing Assistant" });
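
Dapper’s list support, mentioned earlier, builds on the same anonymous-class parameters.  As a rough sketch (the table and column names are the Northwind ones used above), passing an array for a parameter expands it into an IN clause:

    var someProducts = conn.Query<Product>(
        "SELECT * FROM Products WHERE ProductId IN @Ids",
        new { Ids = new[] { 1, 2, 3 } });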

The multi mapping feature is handy and lets you map one row to multiple objects:

class Order
{
    public int OrderId { get; set; }
    public string CustomerId { get; set; }
    public Customer Customer { get; set; }
    public DateTime OrderDate { get; set; }
}

class Customer
{
    public string CustomerId { get; set; }
    public string City { get; set; }
}

...

var sql =
    @"SELECT * FROM
        Orders o
        INNER JOIN Customers c
            ON c.CustomerID = o.CustomerID
    WHERE
        c.ContactName = 'Bernardo Batista'";

var orders = conn.Query<Order, Customer, Order>(sql,
    (order, customer) => { order.Customer = customer; return order; },
    splitOn: "CustomerID");

var firstOrder = orders.First();

Console.WriteLine("Order date: {0}", firstOrder.OrderDate.ToShortDateString());

Console.WriteLine("Customer city: {0}", firstOrder.Customer.City);

Here, the Customer property of the Order class does not correspond to a database column.  Instead, it will be populated with customer data that was joined to the order in the query.

Make sure to join tables in the right order or you may not get back the results you expect.
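
The multiple result sets feature works along similar lines: you send several queries in one batch and read each result set in order.  Here is a minimal sketch, again assuming the Northwind tables used above:

    var batchSql = "SELECT * FROM Products; SELECT * FROM Shippers";

    using (var multi = conn.QueryMultiple(batchSql))
    {
        var products = multi.Read<Product>();
        var shippers = multi.Read();   // dynamic, as with Query()
    }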

First Impressions

Dapper is slightly larger than some other micro-ORMs, but its focus on raw performance means that it excels in that area.  It is flexible and works with POCOs or dynamic objects, and its use on the Stack Overflow website suggests that it is stable and well-tested.

Continue to Part 2…

Ignoring ReSharper Code Issues in Your New ASP.NET MVC 3 Application

ReSharper is a great tool for identifying problems with your code.  Simply right-click on any project in the Solution Explorer and select Find Code Issues.  After ReSharper analyzes all the files, you’ll see a window with several categories of issues including “Common Practices and Code Improvements”, “Constraint Violations”, and “Potential Code Quality Issues”.

Unfortunately, when you create a new ASP.NET MVC 3 application in Visual Studio 2010, ReSharper will find thousands of code issues before you even start coding.

2019 issues found

Most of these “issues” are in jQuery and Microsoft’s AJAX libraries, and your average developer is not going to go around adding semicolons all day when they have real work to do.  So we need to tell ReSharper to ignore these known issues somehow.

It would be nice if ReSharper allowed you to ignore files using file masks, but it doesn’t.  You must specify each file or folder individually.  Go to ReSharper->Options…->Code Inspection->Settings.  Click Edit Items to Skip.

My first instinct was to lasso or shift-click to select all the jQuery scripts, but this is not allowed!  I certainly wasn’t going to bounce back and forth between dialog windows a dozen times just to add each file.

Luckily this is ReSharper, and we can move all the script files into another directory and update references automatically.  Select all the jQuery scripts in the Scripts folder simultaneously, right-click, and go to Refactor->Move.  Create a new jquery folder under Scripts and click Next.

Move to Folder

Now you can go back into the ReSharper options and add this folder to the list of items to skip.

Skip jQuery folder

Move Microsoft’s script files into their own folder, and tell ReSharper to ignore these as well.  I’m also using Modernizr, so I excluded the two Modernizr scripts individually.

Skip Files and Folders

Find Code Issues again and things should look much better.  I’ve only got 25 issues now.

25 code issues

With the help of ReSharper’s refactoring capabilities I was able to get this down to one issue in just a few minutes.  Now you can get on with your project without having to mentally filter out a bunch of noise in the Inspection Results window.

Happy coding!

An HTML5 Music Visualizer for Dev:Unplugged

HTML5 is Here

Although HTML5 is still in development, the latest generation of popular browsers (those released within the past month or so) support a surprisingly consistent set of HTML5 features.  This allows developers to start seriously targeting the future standard and taking advantage of its many benefits.

The Contest

Microsoft is currently running a contest called {Dev:Unplugged} that gives Web developers the opportunity to showcase their HTML5 skills.  Entrants have the option of creating a game or music-related site, and compete for some awesome prizes.  On May 9, an expert team of judges will start evaluating entries based on several criteria such as creativity, quality, and fit with the contest theme.

My Entry: html5beats.com

What it is

html5beats is a music visualizer that generates real-time animations that respond to the beat of the music.  In the past you had to use Flash or embedded media players to accomplish this.  With HTML5 you can do it with JavaScript and markup alone.

How it works

To synchronize audio and video, you must have access to the raw audio data.  Unfortunately, browsers don’t provide this access in a consistent way (and some don’t offer it at all).  I wrote a small C# program that preprocesses the sound files, and I add its output (RMS amplitude) to a JavaScript file.  It doesn’t need to be high resolution (8-bit, 40Hz), so it works out to only about 20KB per song.  At first I thought I had invented this method, but I Googled around and discovered that someone else beat me to it.  Nevertheless, it works well in practice and provides interesting results.
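
For the curious, here is a minimal sketch of that preprocessing step.  It is not the actual html5beats code: it assumes the song has already been decoded to mono floating-point PCM samples, and the class and method names are made up for illustration.  The 40Hz window rate and 8-bit resolution match the numbers above.

using System;
using System.IO;

static class AmplitudePreprocessor
{
    // Writes one 8-bit RMS amplitude value per 1/40th of a second of audio
    // as a JavaScript array that the visualizer can index by animation frame.
    public static void WriteAmplitudeFile(float[] samples, int sampleRate, string outputPath)
    {
        int windowSize = sampleRate / 40;               // 40 values per second
        int windowCount = samples.Length / windowSize;
        var levels = new byte[windowCount];

        for (int w = 0; w < windowCount; w++)
        {
            double sumOfSquares = 0;
            for (int i = 0; i < windowSize; i++)
            {
                double s = samples[w * windowSize + i];
                sumOfSquares += s * s;
            }
            double rms = Math.Sqrt(sumOfSquares / windowSize);       // RMS amplitude of this window
            levels[w] = (byte)Math.Min(255, Math.Round(rms * 255));  // quantize to 8 bits
        }

        // Emit the values as a JavaScript array literal.
        string js = "var amplitudes = [" + string.Join(",", levels) + "];";
        File.WriteAllText(outputPath, js);
    }
}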

Features

Cross-browser compatibility

The following browsers are officially supported:

You can try other browsers with varying results.  Some will trigger a compatibility message, while others (like Firefox 3.6) will mostly work, but the site won’t look as good.

Full screen mode

The HTML5 canvas does not explicitly support full screen.  I solve this problem by using a second canvas that fills the entire page.  The image from the smaller main canvas is copied to the larger one every frame.  This may sound inefficient, but it performs well in all my tests.

Lyrics

This feature displays lyrics as the song plays, and can be turned on or off.  Although the canvas supports text directly, drawing straight to the canvas would interfere with some of the inter-frame effects I’m using.  Therefore, I position a div element over the canvas and change its inner text dynamically.

Pinned site features

Internet Explorer 9 offers a great new feature called “pinned sites” that provides Windows 7 desktop integration.  I’ve taken advantage of several pinned site features that enhance the user experience under IE9.

Feature detection and discoverability

Pinned site prompt

If you’re browsing with IE9, html5beats will detect it and prompt you to try pinning the site.  If you don’t like seeing this prompt you can close it.  Pinning your site adds a high-quality icon to the taskbar and gives you access to additional functionality.

Jump List

Right-clicking the taskbar icon shows a Jump List with tasks that can take you directly to a specific page within the site, even if the browser isn’t currently open.

Thumbnail Toolbar

Thumbnail toolbar

This is one of the coolest aspects of pinning the site.  Hovering over the taskbar reveals playback buttons so you can play, pause, and navigate songs even when the browser doesn’t have focus.

Update: Previous Track and Next Track buttons have been added for additional control of the player.

CSS3

Until now, effects like rounded corners, shadows, and translucency were only available through browser-specific features, custom images, and elaborate CSS trickery.  CSS3 makes those techniques obsolete.  html5beats exploits CSS3 to improve the aesthetics of the main UI.

Using the Code

For now I’m disallowing use of the code, mainly to prevent someone from using it in a competing entry.  After the contest ends I plan on cleaning it up a bit and releasing it under an open-source license.

Please Consider Supporting the Site with Your Vote

In addition to earning a high score from the judges, winning requires votes from the community.  If you like my entry, please vote for it today… there’s only one week left!  Also, look forward to new features and updates in the coming days – this is the home stretch.

Profiling Built-In JavaScript Functions with Firebug

Firebug is a Web development tool for Firefox.  Among other things, it lets you profile your JavaScript code to find performance bottlenecks.

To get started, simply go to the Firebug Web site, install the plugin, load a page in Firefox and activate Firebug.  Click the Profile button under the Console tab once to start profiling, and again to stop it.  Firebug will display a list of functions, the number of times they were called, and the time spent in each one.

For example, here is a page that repeatedly draws a red rectangle and blue circle on the new HTML5 canvas:

<!DOCTYPE html>
<html>
    <head>
        <meta charset="utf-8" />
        <title>Profiling Example</title>
    </head>
    <body onload="drawShapes();">
        <canvas id="canvasElement" width="200" height="200">
            Your browser does not support the HTML5 Canvas.
        </canvas>
        <script>
            function drawShapes() {
                var canvasElement = document.getElementById('canvasElement');
                var context = canvasElement.getContext('2d');

                context.fillStyle = 'rgb(255, 0, 0)';

                // Draw a red rectangle many times.
                for (var i = 0; i < 1000; i++)
                {
                    context.fillRect(30, 30, 50, 50);
                }

                context.fillStyle = 'rgb(0, 0, 255)';

                // Draw a blue circle many times.
                for (var i = 0; i < 1000; i++)
                {
                    context.beginPath();
                    context.arc(70, 70, 15, Math.PI * 2, 0, true);
                    context.closePath();
                    context.fill();
                }
            }
        </script>
    </body>
</html>

Let’s assume your code is taking a long time to execute.  Running the profile produces these results:

Profiling Results 1

This isn’t very useful because only user-defined functions show up.  There is only one significant function here so there’s nothing to compare.  If there were some way to profile built-in JavaScript functions, we might get a better idea of which parts of the code are running slowly.

Note: This is a contrived example written to illustrate a point.  It would be just as effective, and probably a better design overall, to extract two methods named drawRectangle() and drawCircle().  See Extract Method.

As a workaround, you could wrap some of the native functions and call the wrappers in your program code, like this:

function drawShapes() {
    var canvasElement = document.getElementById('canvasElement');
    var context = canvasElement.getContext('2d');

    context.fillStyle = 'rgb(255, 0, 0)';

    // Draw a red rectangle many times.
    for (var i = 0; i < 1000; i++)
    {
        fillRect(context, 30, 30, 50, 50);
    }

    context.fillStyle = 'rgb(0, 0, 255)';

    // Draw a blue circle many times.
    for (var i = 0; i < 1000; i++)
    {
        context.beginPath();
        context.arc(70, 70, 15, Math.PI * 2, 0, true);
        context.closePath();
        fill(context);
    }
}

function fillRect(context, x, y, w, h) {
    context.fillRect(30, 30, 50, 50);
}

function fill(context) {
    context.fill();
}

But that would impact your design and create unnecessary overhead.  Ideally, you’ll want a solution that’s only active during debugging and doesn’t affect your production script.  One way to do this is to write overrides for the native functions and store them in their own .js file (don’t forget to reference the script file in the HTML page):

if (window.console.firebug !== undefined)
{
    var p = CanvasRenderingContext2D.prototype;

    p._fillRect = p.fillRect;
    p.fillRect = function (x, y, w, h) { this._fillRect(x, y, w, h) };

    p._fill = p.fill;
    p.fill = function () { this._fill() };
}

What we’re doing here is saving the original function by assigning it to another function with the same name, prefixed with an underscore.  Then we’re writing over the original with our own function that does nothing but wrap the old one.  This is enough to make it appear in the Firebug profiling results.

Profiling Results 2

The beauty of this approach is that it only runs when the Firebug console is turned on.  When it’s not, the conditional check fails and the code block is not executed.  The check also fails in other browsers such as IE9 and Chrome 11 beta, which is exactly what we want.

One disadvantage is that you have to write a separate function for each native function you want to override.  In the above example, a significant amount of time is probably spent in context.arc(), but we didn’t override it so there’s no way to tell.  It may be possible to override and wrap every function in a specified object automatically, but I haven’t tried that yet.  For now, I’ll leave it as an exercise for the reader.

Workaround for NullReferenceException in DBComparer

DBComparer 3.0 is a great tool if you want to synchronize your SQL Server database environments and don’t have hundreds of dollars to spend on Red Gate’s SQL Compare.  It’s simple to use and free.

http://dbcomparer.com/

I used it for a couple of weeks without any problem until one day when I tried to compare with a particular server and it crashed:

DBComparer NullReferenceException

Looking at the error message, we can deduce that the WriteRecentList() function saves the names of the servers you have typed in the recent servers list.  This is sort of like the recent files list found in some applications.

SettingsBase is part of the .NET Framework, and this part of the code is probably used to persist application settings.  A little digging around on the MSDN library reveals this:

Specific user data is stored in a file named user.config, stored under the user’s home directory. If roaming profiles are enabled, two versions of the user configuration file could exist. In such a case, the entries in the roaming version take precedence over duplicated entries in the local user configuration file.

A look in the user.config file confirms our theory that this is where the list of recent servers is stored.  However, DBComparer is only designed to support 10 recent server names (5 on each side of the comparison).  Any more than that and it blows up.

<?xml version="1.0" encoding="utf-8"?>
<configuration>
  <userSettings>
    <DBCompare.Properties.Settings>
      <setting name="RecentServerName11" serializeAs="String">
        <value>server1</value>
      </setting>
      <setting name="RecentServerName12" serializeAs="String">
        <value>server2</value>
      </setting>
      <setting name="RecentServerName13" serializeAs="String">
        <value>server3</value>
      </setting>
      <setting name="RecentServerName14" serializeAs="String">
        <value>server4</value>
      </setting>
      <setting name="RecentServerName15" serializeAs="String">
        <value>server5</value>
      </setting>
      <setting name="RecentServerName21" serializeAs="String">
        <value>server6</value>
      </setting>
      <setting name="RecentServerName22" serializeAs="String">
        <value>server7</value>
      </setting>
      <setting name="RecentServerName23" serializeAs="String">
        <value>server8</value>
      </setting>
      <setting name="RecentServerName24" serializeAs="String">
         <value>server9</value>
      </setting>
      <setting name="RecentServerName25" serializeAs="String">
        <value>server10</value>
      </setting>
    </DBCompare.Properties.Settings>
  </userSettings>
</configuration>

As a workaround until this bug is fixed, you can delete some or all of the server names in the value tags to make room for more and prevent the error.

Same Markup, Same Browser, Different Results

Here is a simple HTML5 page I created:

<!DOCTYPE html>
<html lang="en">
<head>
 <meta charset="utf-8" />
 <title>Inconsistent Rendering</title>
</head>
<body>
 <span style="height: 1px; overflow: visible; background-color: Green;">
 <h1>This is a test.</h1>
 </span>
</body>
</html>

And here is that page rendered in Internet Explorer 8.  In one case it’s being served on my laptop, and the other on the local intranet.

Test file on localhost

Test file on intranet

The markup, the Web servers, and the browser are all set up 100% the same.  These should be identical, shouldn’t they?  What’s the problem?  The answer is Compatibility View.

Compatibility View is a “feature” of Internet Explorer that causes it to ignore modern Web standards.  That would be fine, except Compatibility View settings vary by zone, and different zones have different defaults.  Also, the Compatibility View button in the address bar isn’t always available, which means there’s no visual indicator as to what mode you’re in, and you can’t change modes for the current page easily.

Compatibility View settings can be accessed in the Tools –> Compatibility View Settings menu, but if you’ve developed a standards compliant intranet site, you need a better fix than simply asking each individual user to reconfigure their browser.  Group Policy is one option, but there is a much simpler solution.  Just add the following meta tag to your pages, and it will force IE 8 to use the !DOCTYPE declaration in the page to determine the rendering mode.

<meta http-equiv="X-UA-Compatible" content="IE=EmulateIE8" />

If you’re experiencing rendering inconsistencies with the same markup being served from different environments, have a look into Compatibility View. It just might be the culprit.

Mercurial HTTP Error 500: Access is denied on 00changelog.i

Today I created a new Mercurial repository on a Windows server.  I cloned it, made some changes, tried to push, and was greeted with this:

C:\myapplication>hg push
pushing to http://servername/myapplication
searching for changes
abort: HTTP Error 500: .hg\store\00changelog.i: Access is denied

My user account had write permission to the myapplication folder on the server, and the odd thing is that I’ve created repositories there before and never had a problem pushing changes.  I compared 00changelog.i to the same file in another repository that was working.  It turns out I was using anonymous authentication and IUSR was missing write permission.  I gave full control to IUSR on the .hg\store folder and…

C:\myapplication>hg push
pushing to http://servername/myapplication
searching for changes
remote: adding changesets
remote: adding manifests
remote: adding file changes
remote: added 1 changesets with 114 changes to 114 files

Success!

If you’re having problems pushing to a central server with Mercurial, make sure the IIS anonymous authentication account (IUSR or IUSR_MachineName) has write permission to the .hg\store folder and its subfolders in your repository.
