Common problems with appCache

A client of mine needed a large part of their existing website to work offline; the web site is really an app that is used to enter a fair amount of information through various forms. The site is designed to be used on mobile devices and in the outdoors, where network connectivity is not the best, so the site already used localStorage to store user input when the server isn’t accessible. I tried to get it to work with appCache, but got more problems than I bargained for. In the end I decided that it just wasn’t worth the trouble; it would be impossible to keep all the offline data up-to-date without a huge amount of network traffic.

Here are some of the problems I encountered.

A url is either always cached, or never cached

This is absolutely the biggest problem when using appCache. You can tell the browser to cache a url, but after that the browser will always use the cached version of the file, even when you are on-line! For static sites this isn’t a problem, but for dynamic sites with pages whose content changes, this is just horrible.

Using Cache-Control headers? Doesn’t help. The browser caches the page no matter what.

So what’s the solution? The only solution I can think of is to serve the same pages from two different urls; one url is used for normal access, and the other url is cached by the browser so it can be accessed offline. So I access this page:

And I tell the browser to cache this page:

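With made-up urls, the two-url scheme could look something like this (the real site’s urls are different):

```
http://example.com/report           <- normal access, never listed in cache.manifest
http://example.com/offline/report   <- same content, listed in cache.manifest
```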
That way the user always gets a fresh copy of the page when on-line. But this brings us to the next problem: how do we make sure the browser has the latest version of each page in its cache?

The cache is only updated when the cache.manifest changes

The browser will never automatically check if a page in its cache should be updated, unless the contents of the cache.manifest has changed. Using Cache-Control or Expires headers doesn’t help. All you can do is modify the cache.manifest (for example by updating a timestamp in it) whenever you know/think a page might have been updated.

Also, you cannot tell the browser that a single file has changed; the browser will always reload all files listed in the cache.manifest. That’s a bit of a problem if you want a large number of pages to be available offline.
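For example, with a cache.manifest like this (file names made up), bumping the version comment is enough to make browsers re-download everything listed:

```
CACHE MANIFEST
# version 2012-11-07-1  -- change this line to force a full re-download

CACHE:
/offline/form.html
/offline/scripts.js

NETWORK:
*
```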

Cross-domain fallbacks are tricky

Since none of the pages on my regular site are cached, I need to tell the browser to redirect to the offline site whenever the regular site is inaccessible. This is called a fallback.

That sounds easy, but it isn’t. I tried to redirect to my offline site if a page on the regular site cannot be accessed (and is not cached) by putting this cache manifest on my regular site:


That should tell the browser to cache the url, and display it whenever accessing any page on the regular site is not possible. Unfortunately, that just doesn’t work. Chrome refuses to redirect; instead it just shows the standard “Unable to connect to the Internet” message.

OK, next I tried to outsmart Chrome by placing a /offline url on my regular site, and instructing the server to do a 301 redirect to the offline site when someone tries to access the /offline url.

FALLBACK:
/ /offline

But nope, that doesn’t work either. You are not allowed to do cross-domain redirects in the fallback pages. It actually says so in the spec. What does work is doing the redirect in javascript. I simply put this content in the /offline file:

$(document).ready(function () {
    // hypothetical address for the offline site; the real url is omitted here
    window.location = 'http://offline.example.com/';
});

No changes are needed to the cache manifest; the browser automatically caches the urls mentioned in the FALLBACK section.

Yay, I thought, I’m slowly getting somewhere with this.

Each site has its own appCache

Yes, each site has its own appCache, and it can contain files from other sites. Read that again. It means the browser doesn’t have one big bucket where it places all files from all cache.manifests. Instead, each site has its own bucket, where all files from that site’s cache.manifest are placed. Files from different domains can be placed in the same appCache; for example if your site requires jQuery, you can put its url in the cache.manifest for your site. But that url is only cached for your site. If some other site also requires jQuery, it must place that url in its own cache.manifest. Sounds complicated? Below is an example.

I tried putting a cache.manifest with the following content on the main site:

FALLBACK:
/ /offline

You would think that it caches the front page of the offline site, and you could then access that page when offline, right? Not quite. It does cache that page, but places it in the appCache for the main site. That means the browser can only access the offline site when it’s referenced from the main site (i.e. when a page on the main site loads an image, script, css etc. from the offline site).

When you go directly to the offline site, the browser checks whether it has an appCache for that site, and since it doesn’t, it just says “Unable to connect to the Internet”, even though that exact url is in the appCache (for a different site).

In Chrome, you can write chrome://appcache-internals/ in the address bar to see what appCaches you have, and what files each of them contains.

So called ‘master’ files are automatically cached

Any page that contains a <html manifest="/cache.manifest"> tag is called a master file, and is automatically added to the appCache. Apparently you cannot prevent this. Why is this a problem? Well, once the master file goes into the appCache, it will never again be automatically updated from the server (until you modify the cache.manifest).

A common solution is to put the manifest reference in a hidden iframe; that way the iframe is the master and will be cached, but the page containing the iframe will not be.
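For illustration, the trick looks something like this (file names invented):

```html
<!-- The outer page: NOT cached, because it has no manifest attribute -->
<!DOCTYPE html>
<html>
<body>
    <!-- The hidden iframe becomes the 'master' file and gets cached instead -->
    <iframe src="/appcache-loader.html" style="display: none"></iframe>
    <!-- ...normal page content... -->
</body>
</html>
```

And the loader page is the only one that references the manifest:

```html
<!-- /appcache-loader.html -->
<!DOCTYPE html>
<html manifest="/cache.manifest">
<body></body>
</html>
```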


I came to the conclusion that the users would have to download several megabytes of data to their appCache to be able to use the site offline; the site just wasn’t designed with appCache in mind from the start. I scrapped the whole appCache idea for this site. I hope you have better luck with it!

Compare (Diff) branches in Tortoise Git, or how to preview changes before doing a merge

I’ve recently switched over from SVN to Git for my version control. I’m a Windows user who’s used TortoiseSVN, so I was glad when I found out about TortoiseGit, which presents a very similar user interface to good ol’ TortoiseSVN. Still, there was lots to learn, as Git has a fundamentally different logic for storing the repository, both locally and remotely. Committing changes is a two-step process, unlike SVN’s single step.

Anyway, when doing software development in a project with several developers working on many branches, it’s sometimes nice to be able to take a ‘sneak peek’ into the future and see what will happen when the branches are finally merged into one.

So how do you answer the question “what changes would be made if I merged these two branches right now?” Will there be any conflicts, for example? You could of course pull the changes from one branch to the other, but that is not always desirable. There is a way to compare branches in TortoiseGit, but it’s fairly well hidden by default.

Here’s how you can preview the changes before actually making them:

  • Shift-“right click” on your git folder
  • Select TortoiseGit >> Browse Reference

  • Select two branches from the list (hold shift to select more than one)
  • Right click and select “Compare selected refs”

This opens a window that displays the diff between the branches. Note that the diff is displayed as if the bottom branch (“Version 2”) was merged into the top branch (“Version 1”). There’s an arrow button at the top that lets you change the direction of comparison.
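TortoiseGit is just a GUI over plain git, so the same preview works on the command line. This throwaway sketch (made-up repo and branch names) builds two diverging branches, shows their diff with the three-dot syntax, and does a dry-run merge to surface conflicts without committing anything:

```shell
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q
git checkout -qb trunk                  # fixed branch name, avoids master/main differences
git config user.email you@example.com
git config user.name you
echo "base" > file.txt
git add file.txt
git commit -qm "base commit"
git branch feature                      # branch off here
echo "trunk work" >> file.txt
git commit -qam "work on trunk"
git checkout -q feature
echo "feature work" >> file.txt
git commit -qam "work on feature"
# Three-dot syntax: what 'feature' changed since it diverged from 'trunk'
git diff trunk...feature
# Dry-run merge: reports conflicts but commits nothing; abort restores the tree
git merge --no-commit --no-ff trunk || true
git merge --abort 2>/dev/null || true
```

The `--no-commit --no-ff` merge is the closest answer to “what would happen if I merged right now?”: it leaves the result (or the conflicts) in the working tree for inspection, and `git merge --abort` undoes it.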

How to make <pre> tags wrap

Especially when using a responsive site design, you will probably encounter a need to display some text using a fixed-width font but with word wrap, so you don’t get excessively long lines on small displays. Here’s the way to do it:

pre {
    display: inline-block;
    white-space: pre-wrap;
}

How to generate Entities from an Existing Database in Symfony 2.1

I was trying to generate Symfony 2.1 entities from an existing MySQL database according to the instructions in the current documentation. I was supposed to be done with three simple commands, but it turned out to be somewhat more difficult than that.

Fix your config

First of all: this converter doesn’t support some basic MySQL data types by default, so you have to edit your config.yml file as follows:

# Doctrine Configuration
doctrine:
    dbal:
        mapping_types:
            enum: string
            bit: integer

Fix your database

Next, make sure all your database tables have primary keys. OK, that might sound obvious, but I had some odd old tables (which I wasn’t going to use in this project) lying around in the database. Apparently there is no way to tell the converter that you want to use only some of the tables.

Run the commands


To start with, the documentation tells you to run the following command:

php app/console doctrine:mapping:convert yml ./src/Miro/KPBundle/Resources/config/doctrine/metadata/orm --from-database --force

They forgot to update the documentation when the directory structure changed at some stage (2.0?), so the target folder should be ./src/Miro/KPBundle/Resources/config/doctrine. Secondly, you really need to tell the command what namespace these entities will be using; otherwise you will have to edit the yaml files yourself to add it. Therefore, the correct command in my case is:

php app/console doctrine:mapping:convert yml ./src/Miro/KPBundle/Resources/config/doctrine --from-database --force --namespace="Miro\KPBundle\Entity\\"

I figured out the correct command after some trial and error. Note the double “\\” at the end; that’s on Windows. If you’re using Linux, you probably need to add some more backslashes.

The command creates yaml files which are otherwise perfect, except that they are named like this:

Miro.KPBundle.Entity.BkAccount.orm.yml

The second command to run was:

php app/console doctrine:mapping:import MiroKPBundle annotation

It’s supposed to create entity classes for your database tables, and I suppose it does do that;  it just doesn’t care about the .yml files created by the previous command. So now I have entities for all tables in my database. No problem, I just delete the ones I don’t want.

Since I didn’t create foreign keys in the tables (as MySQL doesn’t use them anyway), I had to edit the .yml files to tell Symfony/Doctrine how the tables/entities should be related to each other. Not a problem, except that I couldn’t find any documentation for the file format, only some examples of what the files should look like.
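For illustration, a hand-added relation in Doctrine’s yaml mapping format looks something like this (the BkAccount entity is from my project; the BkUser entity and the column names are invented for the example):

```yaml
# Sketch of a manyToOne relation in BkAccount.orm.yml
Miro\KPBundle\Entity\BkAccount:
    type: entity
    table: bk_account
    id:
        id:
            type: integer
            generator:
                strategy: AUTO
    fields:
        name:
            type: string
            length: 255
    manyToOne:
        user:
            targetEntity: BkUser
            joinColumn:
                name: user_id
                referencedColumnName: id
```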


Once I’d figured out the correct syntax, it was time to run the final command:

php app/console doctrine:generate:entities MiroKPBundle

This gives the not so informative error message:

Generating entity "Miro\KPBundle\Entity\BkAccount"


Invalid mapping file 'Miro.KPBundle.Entity.Miro.KPBundle.Entity.BkAccount.orm.yml' for class 'Miro\KPBundle\Entity\BkAccount'.

doctrine:generate:entities [--path="..."] [--no-backup] name

So there is a problem in the file somewhere. And not just in one file; they all gave the same error. Was I supposed to rename the files to ‘Miro.KPBundle.Entity.Miro.KPBundle.Entity.BkAccount.orm.yml’? That didn’t seem right, especially since it wasn’t exactly a ‘file not found’ error. Anyway, by trial and error I figured out that the correct file name was simply ‘BkAccount.orm.yml’.

The command is incorrectly named IMO, since you already created the entities with the previous command (if you told it to create annotations like I did). All this command does is add getters and setters for the table columns.