Review – “Django Essentials”

This is a review of “Django Essentials” from Packt Publishing, but first some background:

Around 2007 I was fairly fluent in Django, after having built a substantial Django web application from scratch. These were the days before Django 1.0. While all the necessary documentation was there, the framework had some rough edges and peculiar design choices.

Still, Django made a lot of sense. I had previously dabbled with Ruby on Rails but it never really clicked with me. Once I grew comfortable with the Django mindset I never looked back.

Time passes and I move on to other projects. I tried to keep track of what was happening in the world of Django, but lost interest around version 1.2. Fast forward to 2014, and I suddenly get the chance to develop with Django again. By keeping an ear on the tech grapevine I have a rough idea of the major innovations to Django since it dropped off my radar. Nonetheless, the devil is in the details and I dive into the Django documentation.

You can say a lot of good things about the Django docs. For starters, they are very comprehensive and quite well organized. Still, they are not a good choice for a learner or for someone who just needs a quick refresher. The wealth of information is a good thing when you need a solution to a particular problem. It is not a good thing when you just want to figure out how all the pieces fit together. There is the Django Book as well, but it is terribly outdated.

Luckily, the good folks at Packt Publishing were kind enough to offer me a review copy of the recently released “Django Essentials” by Samuel Dauzon at just about this time.

It is a nice, short read, clocking in at about 150 pages. If you skip implementing the code examples, it should not take you many hours to get through it. That being said, by working your way through the examples you will get a clear sense of how Django will let you make your web application come to life.

What I particularly like about this book is that it puts Django in its proper context. Best-practice tools and techniques such as Nginx, jQuery, AJAX, PostgreSQL and virtualenv are introduced, meaning that you will have an idea of how to make them work with your Django application.

All the major Django concepts are covered in a pedagogic manner. As an example, classic views are introduced first. Later on, class-based views are introduced, showing you how they can save you a lot of time—while reminding you that classic views still are useful in certain situations. A nice touch is that South, the (soon to be deprecated) Django data migration tool, is introduced before you write your first Model class.

The book is well written and nicely laid out. One minor annoyance is that some of the code examples could have been better formatted. Another nitpick is the UserProfile model example in the models chapter. For the novice Django developer it should be pointed out right away that the secure choice is to always use the built-in User class rather than trying to roll your own. Fortunately this is explained later in the book.

Given that the book spends a bit of time on the recommended ecosystem around Django, it would have been nice to see mentions of other tools such as Bootstrap and the various third-party amendments to Django which make it a lot more powerful. Still, the brevity of this book is one of its strengths.

All in all, “Django Essentials” is a great read and comes highly recommended if you know a little bit of Python and want to level up to web application development.

AWS cuts data transfer rates: Pricing comparison update

Posted on February 2, 2010
in AWS, EC2

AWS just cut their outbound data transfer rates from $0.17 to $0.15 per GB for the first 10TB per month. I have updated my previous comparison between Go Daddy and AWS with the latest numbers.

Updated AWS/Go Daddy dedicated server cost comparison

Posted on August 28, 2009
in AWS, EC2

UPDATE 1: Corrected the bandwidth calculation in the formulas for AWS.

UPDATE 2: Added new data for the February 2010 AWS data transfer price reduction.

In a previous posting I did a cost comparison of a reserved Amazon Web Services EC2 instance and a comparable dedicated server from Go Daddy. Amazon recently announced a set of price cuts for reserved instances, so an updated comparison is in order.

The server configurations I’m comparing are the same as last time:

                            Go Daddy             AWS                     AWS (new)
Processor                   Core 2 Duo 2.66 GHz  4 EC2 Compute Units*    4 EC2 Compute Units*
Hard drive(s)               Dual 300GB drives    850GB instance storage  850GB instance storage
Memory                      3.2GB                7.5GB                   7.5GB
1-year plan, w/o bandwidth  $2,483.46            $2,351.20               $1,962.40
3-year plan, w/o bandwidth  $6,622.56            $5,153.60               $4,557.20

* 2 virtual cores with 2 EC2 Compute Units each; one unit equals a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor.

I have added an extra column for the new EC2 reserved instance pricing scheme. Notably, the prices for the Go Daddy options haven’t changed in the last six months. For the 1- and 3-year AWS plans, the total costs have dropped by $390 (17%) and $600 (12%), bandwidth excluded.

(For the full discussion, refer to the previous posting.)

When including bandwidth use, the updated table looks as follows:

                              1GB/mth    20GB/mth   100GB/mth  400GB/mth  800GB/mth
Go Daddy, 1-year plan         $2,483.46  $2,483.46  $2,483.46  $2,483.46  $2,723.34
AWS, 1-year plan              $2,353.24  $2,392.00  $2,555.20  $3,167.20  $3,983.20
AWS, 1-year plan (Aug 2009)   $1,963.24  $2,002.00  $2,165.20  $2,777.20  $3,593.20
AWS, 1-year plan (Feb 2010)   $1,963.00  $1,997.20  $2,141.20  $2,681.20  $3,401.20
Go Daddy, 3-year plan         $6,622.56  $6,622.56  $6,622.56  $6,622.56  $7,342.20
AWS, 3-year plan              $5,159.72  $5,276.00  $5,765.60  $7,601.60  $10,049.60
AWS, 3-year plan (Aug 2009)   $4,559.72  $4,676.00  $5,165.60  $7,001.60  $9,449.60
AWS, 3-year plan (Feb 2010)   $4,559.00  $4,661.60  $5,093.60  $6,713.60  $8,873.60

Updated comparison between AWS and Go Daddy pricing plans

With the old pricing, the AWS option was preferable unless bandwidth exceeded 100GB per month (for the 1-year plan) or 250GB per month (for the 3-year plan). After the August 2009 price cuts, AWS became an even more competitive option, although one that still falls behind for high bandwidth scenarios.

In February 2010, the price for outgoing data traffic dropped from $0.17 to $0.15. With the 3-year plan, AWS now matches Go Daddy up until almost 400GB per month.
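
The break-even point can be computed directly from the cost formulas in the addendum below; here is a minimal Python sketch (the $1,400 reservation fee, $0.12 hourly rate, and $0.15/GB transfer rate are the Feb 2010 figures from the addendum):

```python
def aws_3yr_cost(gb_per_month, reservation=1400, hourly=0.12, per_gb=0.15):
    """Total 3-year AWS cost: one-time reservation fee + hourly
    instance charges + outbound data transfer over 36 months."""
    return reservation + 24 * 365 * 3 * hourly + gb_per_month * per_gb * 36

GODADDY_3YR = 6622.56        # flat rate; bandwidth free up to 500GB/month
fixed = aws_3yr_cost(0)      # AWS cost with zero data transfer
per_gb_3yr = 0.15 * 36       # dollars per GB/month over the 3-year term

breakeven_gb = (GODADDY_3YR - fixed) / per_gb_3yr
print(round(breakeven_gb, 1))  # → 383.1, i.e. just short of 400GB/month
```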

Addendum: Some of the background data used in this posting:

  • Go Daddy quotes from August 28, 2009
  • Go Daddy 3-year plan cost: 2-year plan quote * 1.5
  • AWS 1-year plan cost (< Aug 2009): $1,300 + (24 * 365 * 1 * $0.12) + (GB/mth * $0.17 * 12)
  • AWS 1-year plan cost (Aug 2009): $910 + (24 * 365 * 1 * $0.12) + (GB/mth * $0.17 * 12)
  • AWS 1-year plan cost (Feb 2010): $910 + (24 * 365 * 1 * $0.12) + (GB/mth * $0.15 * 12)
  • AWS 3-year plan cost (< Aug 2009): $2,000 + (24 * 365 * 3 * $0.12) + (GB/mth * $0.17 * 36)
  • AWS 3-year plan cost (Aug 2009): $1,400 + (24 * 365 * 3 * $0.12) + (GB/mth * $0.17 * 36)
  • AWS 3-year plan cost (Feb 2010): $1,400 + (24 * 365 * 3 * $0.12) + (GB/mth * $0.15 * 36)
  • EC2 pricing information
  • Go Daddy dedicated server pricing information
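
The formulas above are easy to turn into code; a short Python sketch that reproduces the 1-year plan columns of the table, using the reservation fees and rates listed in the formulas:

```python
def aws_cost(reservation, years, gb_per_month, per_gb, hourly=0.12):
    """Total plan cost: one-time reservation fee + hourly instance
    charges + outbound data transfer over the plan's lifetime."""
    return reservation + 24 * 365 * years * hourly + gb_per_month * per_gb * 12 * years

# 1-year plan at 100GB/month under the three pricing schemes
for label, fee, rate in [("< Aug 2009", 1300, 0.17),
                         ("Aug 2009", 910, 0.17),
                         ("Feb 2010", 910, 0.15)]:
    print(f"{label}: ${aws_cost(fee, 1, 100, rate):,.2f}")
```

The printed values match the 100GB/mth column above: $2,555.20, $2,165.20, and $2,141.20 respectively.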

How to compile SimpleParse 2.1.0a1 for Python 2.6 on Windows Vista

SimpleParse is a fast Python single-pass parser generator that I use regularly. When I finally made the move onto Python 2.6 it turned out that there is no pre-compiled package for 2.6 on Windows. So, here is my procedure for compiling the source package on Windows Vista.

1. Install Cygwin if you don’t already have it on your system, and make sure that the version of Python you are installing SimpleParse for is on either the system or the Cygwin path.

2. Download and install Microsoft Visual C++ 2008 Express Edition. You should ensure that you have the latest Vista service packs installed before attempting this. If the installer quits on you then just reboot the computer and try again. Without this installed, you will get an ‘Unable to find vcvarsall.bat’ error.

3. Download and unpack the SimpleParse 2.1.0a1 source. Using the Cygwin shell, place yourself in the root source directory.

4. If we try to run python setup.py install at this point, the Visual C++ compiler will complain:

stt/TextTools/mxTextTools/mxTextTools.c(149) : error C2133:
'mxTextSearch_Methods' : unknown size
stt/TextTools/mxTextTools/mxTextTools.c(920) : error C2133:
'mxCharSet_Methods': unknown size
stt/TextTools/mxTextTools/mxTextTools.c(2103) : error C2133:
'mxTagTable_Methods' : unknown size
error: command '"C:\Program Files\Microsoft Visual Studio 9.0\VC\BIN\cl.exe"'
failed with exit status 2

We have to add the following lines to stt/TextTools/mxTextTools/mxTextTools.c, starting at line 148 (before staticforward is used for the first time):

#ifdef _MSC_VER
#define staticforward extern
#endif

5. with is a Python 2.6 keyword, meaning it can’t be used as a variable name, as is done in the SimpleParse source code. So, we have to replace it with something else:

$ sed -r 's/\bwith\b/with_t/g' < stt/TextTools/ > tmp.txt
$ cp tmp.txt stt/TextTools/
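
Note that a bare s/with/with_t/g substitution would also rewrite words such as ‘within’ or ‘without’; anchoring the pattern on word boundaries avoids that. A rough Python equivalent of the whole-word rename:

```python
import keyword
import re

# 'with' became a reserved word in Python 2.6, so it can no
# longer be used as an identifier.
assert keyword.iskeyword("with")

def rename_with(source):
    """Rename only whole-word occurrences of 'with', leaving
    words such as 'without' or 'within' untouched."""
    return re.sub(r"\bwith\b", "with_t", source)
```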

6. Finally, run python setup.py install as usual.

On the sadness of nouns

“Writing, Jen thought, seemed like a very sad pursuit. Like painting, but worse. At least paintings had color. Writing, though, was just black marks on paper, standing in for people and objects and events that could never be seen or felt. It seemed pathetic in a way. Nouns were the saddest words of all, trying so hard to summon real objects to life.”

Jon Raymond, “Words and Things” (Livability)

When EndNote X2 fails

The connection between EndNote X2 and Microsoft Word 2007 seems to get corrupted on a regular basis on my Vista setup. Based on hours of web searching and trial and error, here is a short summary of ways of getting it working again. Use these when you get error messages such as ‘server threw an exception’, ‘server execution failed’, and ‘invalid class string’.

In prioritized order:

  • Run EndNote as an administrator (for Windows Vista).
  • Reset EndNote defaults (“Edit -> Preferences -> EndNote defaults”). This seems to work most of the time. Make sure to close Word first. After having reset EndNote, close it, and then try launching it from Word.
  • The library may be corrupted. Try running “Tools -> Recover Library”.
  • If all else fails, reinstall EndNote.

There are other possible problems, especially when upgrading from older versions, but these actions usually work for me.

(For Norwegian readers: If you are using the Norwegian version of EndNote, the error messages will be ‘ugyldig klassestreng’ or ‘serverutføringen mislyktes’.)

Why Amazon Web Services just became a competitive web hosting provider

Posted on March 12, 2009
in AWS, EC2

UPDATE 1: There is now an updated version of this posting. The new version incorporates the August 2009 AWS reserved instance pricing changes.

UPDATE 2: Corrected the bandwidth calculation in the formulas for AWS.

Amazon Web Services just announced a new reserved instances pricing plan. In short, this plan allows you to reserve EC2 instances for a 1 to 3 year period by paying a one-time reservation fee. The hourly rate for reserved instances is considerably lower than for regular on-demand instances. For comparison’s sake, a large standard on-demand instance will set you back $0.40 per hour, while the large standard reserved instance is only $0.12 per hour.

With the old pricing scheme, hosting a web service on AWS instead of on a dedicated server was not a very cost-competitive option, at least not for resource-intensive applications. For my web site, Eventseer, I require at least a large standard instance—at $0.40 per hour for 24/7 operation (bandwidth costs not included), this turned out way too expensive compared with offerings from traditional dedicated server providers.

To see if the new pricing scheme fares any better, I have compared the cost of an AWS EC2 reserved large instance with a similar dedicated server from Go Daddy:

                            Go Daddy             AWS
Processor                   Core 2 Duo 2.66 GHz  4 EC2 Compute Units*
Hard drive(s)               Dual 300GB drives    850GB instance storage
Memory                      3.2GB                7.5GB
1-year plan, w/o bandwidth  $2,483.46            $2,351.20
3-year plan, w/o bandwidth  $6,622.56            $5,153.60

* 2 virtual cores with 2 EC2 Compute Units each; one unit equals a 1.0-1.2 GHz 2007 Opteron or 2007 Xeon processor.

I’m not sure how the 4 EC2 Compute Units compare with a dedicated Core 2 Duo 2.66 GHz; this probably also depends on the nature of your application. Note that the AWS solution has twice the amount of memory. From what I can see, you cannot get a Go Daddy dedicated server with more than 3.2GB of memory, while AWS offers up to 15GB on the extra large instances.

When disregarding bandwidth costs, AWS suddenly makes a lot of sense. As bandwidth use is highly application-dependent, let’s consider a few different bandwidth use scenarios:

                       1GB/mth    20GB/mth   100GB/mth  400GB/mth  800GB/mth
Go Daddy, 1-year plan  $2,483.46  $2,483.46  $2,483.46  $2,483.46  $2,723.34
AWS, 1-year plan       $2,353.24  $2,392.00  $2,555.20  $3,167.20  $3,983.20
Go Daddy, 3-year plan  $6,622.56  $6,622.56  $6,622.56  $6,622.56  $7,342.20
AWS, 3-year plan       $5,159.72  $5,276.00  $5,765.60  $7,601.60  $10,049.60

Comparison between AWS and Go Daddy pricing plans

Conclusion: With a 1-year plan, AWS is the cheapest option until you reach about 100GB of external bandwidth per month. With the 3-year plan, the AWS bandwidth cost isn’t a problem until about 250GB per month. (Bandwidth is “free” with Go Daddy dedicated servers up until 500GB per month; after that it’s an extra $19.99 per month until you reach 1,000GB).

Considering that the AWS solution gets you twice the amount of RAM, AWS suddenly seems a very viable option even for web service hosting—as long as you’re not expecting extreme amounts of traffic. However, once you get popular, the outgoing data transfer pricing will take its toll.

Addendum: Some of the background data used in this posting:

  • Go Daddy quotes from March 12, 2009
  • Formula for Go Daddy 3-year plan cost: 2-year plan quote * 1.5
  • Formula for AWS 1-year plan cost: $1,300 + (24 * 365 * 1 * $0.12) + (GB/mth * $0.17 * 12)
  • Formula for AWS 3-year plan cost: $2,000 + (24 * 365 * 3 * $0.12) + (GB/mth * $0.17 * 36)
  • EC2 pricing information
  • Go Daddy dedicated server pricing information

Running pytst 1.15 on a 64-bit platform

Posted on January 25, 2009
in Python, pytst

Update: The latest version, 1.17, compiles on 64-bit platforms out of the box, so the patch below is no longer necessary.

Nicolas Lehuen’s pytst is a C++ ternary search tree implementation with a Python interface. It’s an excellent tool—and it is also really, really fast.

Unfortunately version 1.15 doesn’t compile on 64-bit platforms, giving the following error messages:

pythonTST.h:178: error: cannot convert 'int*' to 'Py_ssize_t*' for argument '3'
to 'int PyString_AsStringAndSize(PyObject*, char**, Py_ssize_t*)'
tst_wrap.cxx: In function 'PyObject* _wrap__TST_walk__SWIG_1(PyObject*, int, PyO
tst_wrap.cxx:3175: error: cannot convert 'int*' to 'Py_ssize_t*' for argument '3
' to 'int PyString_AsStringAndSize(PyObject*, char**, Py_ssize_t*)'
tst_wrap.cxx: In function 'PyObject* _wrap__TST_close_match(PyObject*, PyObject*
tst_wrap.cxx:3250: error: cannot convert 'int*' to 'Py_ssize_t*' for argument '3
' to 'int PyString_AsStringAndSize(PyObject*, char**, Py_ssize_t*)'
tst_wrap.cxx: In function 'PyObject* _wrap__TST_prefix_match(PyObject*, PyObject
[...and so on...]

Until Nicolas releases an updated version, here is the quick fix:

cp pythonTST.h pythonTST.h.orig
cp tst_wrap.cxx tst_wrap.cxx.orig
sed -r 's/int size/Py_ssize_t size/' < tst_wrap.cxx.orig > tst_wrap.cxx
sed -r 's/int length/Py_ssize_t length/' < pythonTST.h.orig > tmpfile
sed -r 's/sizeof\(int\)/sizeof(long)/' < tmpfile > pythonTST.h
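
If sed isn’t handy (say, on Windows), the same three edits can be scripted in Python; here is a rough equivalent, assuming it is run from the pytst source directory:

```python
import os
import re
import shutil

def patch(path, substitutions):
    """Back up the original file, then apply each
    (pattern, replacement) pair to its contents."""
    shutil.copy(path, path + ".orig")
    with open(path) as f:
        text = f.read()
    for pattern, repl in substitutions:
        text = re.sub(pattern, repl, text)
    with open(path, "w") as f:
        f.write(text)

# The same substitutions as the sed commands above;
# files that aren't present are skipped.
edits = {
    "tst_wrap.cxx": [(r"int size", "Py_ssize_t size")],
    "pythonTST.h": [(r"int length", "Py_ssize_t length"),
                    (r"sizeof\(int\)", "sizeof(long)")],
}
for filename, subs in edits.items():
    if os.path.exists(filename):
        patch(filename, subs)
```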

Run these commands from the pytst source directory and you should be all set. I’m not sure if this is a fully satisfactory solution, but at least it will get the test suite running again.

Dostoevsky on the dangers of science

“He was devoured by the deepest and most insatiable passion, which absorbs a man’s whole life and does not, for beings like Ordynov, provide any niche in the domain of practical daily activity. This passion was science. Meanwhile it was consuming his youth, marring his rest at night with its slow, intoxicating poison, robbing him of wholesome food and of fresh air which never penetrated to his stifling corner. Yet, intoxicated by his passion, Ordynov refused to notice it. He was young and, so far, asked for nothing more. His passion made him a babe as regards external existence and totally incapable of forcing other people to stand aside when needful to make some sort of place for himself among them. Some clever people’s science is a capital in their hands; for Ordynov it was a weapon turned against himself.”

From “The Landlady” (1848)

3 lessons the individual investor can learn from JPMorgan Chase

Posted on September 29, 2008
in Finance, Investment

JPMorgan Chase recently picked up the remains of troubled Washington Mutual for a mere $1.9 billion—a deal described by some as buying the company “for nothing.”

Although the takeover carries a fair amount of risk for JPMorgan, the bank looks increasingly likely to emerge as a credit crisis survivor. They still have a “reasonably strong” balance sheet and have by and large managed to steer clear of a debacle of historical proportions.

How did this come about? Also, are there elements to this story that you as an individual investor can learn from?

Fortune recently published a comprehensive account of how JPMorgan mostly avoided the subprime debacle (“Jamie Dimon’s Swat Team”), which is well worth a read. Based on the article, I have tried to summarize the main principles that have so far kept JPMorgan as a company out of trouble.

These principles are, in my opinion, just as valid on a personal level when you, as an individual investor, are to decide whether or not a company stock is a good buy. So, I have also tried to relate each principle to general best practices of investment.

1. It’s all in the numbers

In early 2006, JPMorgan were, like everyone else, dealing in subprime CDOs. By the end of that year, the bank had dumped more or less all of their subprime mortgage holdings. What happened?

First of all, the numbers were no longer looking good.

JPMorgan has a strong tradition of data-mining every aspect of their business and continuously trying to figure out the story behind the numbers. What Jamie Dimon, the CEO, and his team saw was that the subprime market was way too risky for the profits it was generating. Data from their retail banking division showed that subprime loan payments were increasingly late. Moreover, their own data analysis indicated that the supposedly safe AAA ratings lavished upon CDO bonds were bogus.

The numbers were increasingly and consistently negative and in sharp contrast to the conventional wisdom on the subprime market. Trusting the data and its interpretation rather than the general opinion, JPMorgan left the market altogether.

Learning points:

When evaluating a company as an investment opportunity, you can rely on a barrage of opinion from a sea of sources with a multitude of motivations.

Or, you can go straight to the facts.

The numbers in quarterly reports or annual accounts don’t lie unless deliberately tampered with. If the balance sheet tells you that a company is heading for trouble, then that company is heading for trouble, no matter what anyone else might be saying.

  • Learn how to read and understand the balance sheet, the income statement and the cash flow statement. Once understood, they tell you more about the company than any financial advisor or industry analyst ever will.
  • Be diligent in your pursuit of data, both on the company, the sector, and the general state of the economy.
  • Read the numbers first and then make up your own interpretation. Other people’s interpretations are not gospel, but rather a challenge to your own interpretation.
  • The opinionated parts of a company’s annual report are mostly fluff and should be read as such. Read the annual report from the back. It’s not a crime to talk about a company in optimistic terms; manipulating the numbers is.

2. Investment is not about short-term profit

JPMorgan exited the subprime market while it was still a booming business. This took a lot of guts when other Wall Street firms were making a killing from subprime.

In the short term they lost ground to competitors by not jumping on the latest Street bandwagon. Their conservative stance and the effect it had on quarterly earnings must have generated immense pressure, both internally and externally. From 2005 to 2007, JPMorgan fell from third to sixth place in fixed-income underwriting. This is the sort of development that causes a ruckus in board meetings.

Nonetheless: In the long term they prevailed.

Their decision to trust their analysis of subprime being too risky turned out to be a sensible one, even if this meant a very negative short-term impact on their balance sheets.

By focusing on core company values rather than pursuing immediate profit, JPMorgan emerged on top.

Learning points:

You can certainly make money from overhyped stocks whose valuation belies the true worth of their business. This, however, requires you to play the game of getting out before the bubble of irrational exuberance pops.

Timing the market is ultimately about luck. Luck is a property that is best reserved for the lottery rather than your savings.

Fashion does not imply quality. Just because everyone else is ecstatic about something—be it dot-com companies or the mullet—does not mean you should be as well. Only get with the crowd if there is a fundamentally sane reason for doing so. Dare to be different.

  • Be prepared to stick it out as long as the underlying fundamentals of your analysis do not change. Quality always prevails in the long term.
  • Even fundamentally good stocks go down if they are unpopular. This is the way the market rolls; don’t lose any sleep over it.
  • You will not be able to consistently time the ups and downs of the market, so don’t even try to. Learn to live with the fact that good stocks will sometimes go down for no good reason.
  • Don’t check your portfolio every five minutes. Apart from keeping you from getting any other work done, it will only lead you to perceive the market as more volatile than it really is. Think the market is too volatile? Just reduce your sampling frequency. Remember that you are in it for the long run.
  • Listen to other people but don’t let them make your decisions. Even if they have a compelling chain of arguments there are likely more conclusions that can be drawn from the same set of underlying facts. Your explanation of why to invest should always be your own.

3. Question and diversify

JPMorgan operating-committee meetings are described as “loud and unsubtle”. According to Bill Daley, head of corporate responsibility and former Secretary of Commerce, “[p]eople were challenging Jamie, debating him, telling him he was wrong. It was like nothing I’d seen in a Bill Clinton cabinet meeting, or anything I’d ever seen in business.”

This culture of allowing, encouraging and listening to dissent ultimately made it easier for JPMorgan to make the right decisions. Getting all facts and viewpoints on the table while continuously questioning what they were doing was a major success factor.

Still, JPMorgan made their own share of mistakes.

In 2007 a short-term secured loans unit bought a $2 billion subprime CDO—upper management claims they never knew. Other billion-dollar write-offs had to be endured as well. Their principle of only taking risks when you are paid well for doing so is anything but perfect. It also remains to be seen how well their shotgun purchase of Washington Mutual turns out—among its assets are an estimated $30 billion worth of loans that will have to be written down. However, on the whole they look to be emerging from the credit crisis as a much healthier company than their surviving competitors.

Learning points:

There is no such thing as a risk-free investment, so be prepared to accept losses. JPMorgan’s competitors put themselves in a position where some of them could not weather a downturn in one of their business segments. JPMorgan, on the other hand, were doing what they could to make sure their good moves outweighed their bad moves.

Making bold investment choices always carries the probability of failure. Use diversification as a cushion for when failure strikes.

In bicycle racing, one does not talk in terms of if a rider will take a tumble but rather about when. The same should apply to your investments.

Moreover, be prepared to continuously question your own judgment. The premises for earlier decisions will change, so be prepared to revise them. If faith becomes a stronger motivation than reason for holding on to a stock, it’s probably time to let go.

  • Hedge your investments. No matter the soundness of your strategy or how diligently you stick to your principles, things will still go wrong from time to time. Don’t allow mishaps to take you down.
  • Be prepared to change your position. Things change, the world keeps turning, and so should you.
  • Don’t get emotional about a stock. Your favorite company might turn from making mostly good decisions to making mostly bad decisions. These things happen, so be prepared to get out even if this means taking a loss.
  • If your sole reason for hanging on to a stock is the belief of future recovery then you have already lost. Get rid of it, count your losses, and learn from the experience.
