Say When

When was the last time you – yes, you, specifically – thought, “I sure would enjoy [insert favorite publication] more if they inserted more advertisements”?

Thought so.

A friendly reminder that readers, viewers, fans, etc. aren’t the people pushing and demanding advertisements.

“‘Deal with it’ doesn’t mean ‘Make better advertising’ or ‘Target your advertising more effectively’ or ‘Turn your users into marketers’ (which is Facebook’s latest idea). All that is just more grasp.” – Doc Searls

Related:

“He knowingly took multiple ads for a movie that he hasn’t seen, but believes to be vile. What does that say about Mr. O’Reilly?” – Mark Cuban

For Cuban to know O’Reilly dislikes the movie, O’Reilly would have had to mention it (not an ad?), and Cuban bought some ad time to see how much (relevance?). Think O’Reilly’s fans would miss this conversation if it went away completely?

Rails Cheap, MySQL Expensive

This is an update to an earlier post on Performance and MySQL Indexes.

While 10-12 seconds per feed is an improvement, it’s completely unacceptable when we’re talking 2,000+ feeds (a full parse would take 5.5 hours) and would slow the system down to the point of being unusable.

In an attempt to find the bottlenecks, I loaded up the query_analyzer plugin on development and opened up a terminal window to watch the processes in production via prstat -Z1.

Both showed me the database – not the Rails app itself – was the slow one2.

I was able to get the average per feed parse time under 5 seconds by doing two things:

  1. rewriting the straight find_by_sql queries to use JOIN ... ON ... rather than the clumsy ...WHERE a.id = b.a_id AND b.id = c.b_id AND etc... (sketched below). Why not use Rails’ find helpers? I wasn’t able to track down how to :include multiple tables/models.
  2. comparing the feed’s last-modified date against the last-checked date in the database prior to running it through the parser (also sketched below).
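
Roughly what those two changes look like (a minimal sketch, not the production code: Item, Feed, the column names, and parse are stand-ins for my actual models and methods; feed is the current record in the per-feed loop):

# 1. Explicit JOIN ... ON ... instead of chained WHERE equality conditions
items = Item.find_by_sql(<<-SQL)
  SELECT items.*
  FROM items
  JOIN publishings ON publishings.item_id = items.id
  JOIN feeds ON feeds.id = publishings.feed_id
  WHERE feeds.id = #{feed.id}
SQL

# 2. Skip the parser entirely when the feed hasn't changed since the last check
parse(feed) if feed.last_modified.nil? || feed.last_modified > feed.last_checked_at
feed.update_attribute(:last_checked_at, Time.now)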

Both of them are ‘duuuuh’ improvements, easily cutting the CPU burden of my database process to a third of what it was, often down to barely noticeable.

UPDATE 2 Dec 2007
Continued refactoring now has the average feed parse time around 3 seconds. Yes, that means in some instances I’m seeing 5+ feeds / second (unchanged feeds get skipped, so they take almost no time).

1. This monitoring process is now my biggest burden. 🙂
2. I’m sure any reasonably experienced Rails developer says this throughout the day.

China Restates Earnings

“…with the corrected China [Purchasing Power Parity] statistics, the whole question is moot. China is just not that big now and will not get that big any time soon.” – Albert Keidel of the Carnegie Endowment

Keidel’s article put a number of things into perspective for me – namely why I hear ‘China, China, China’ everywhere I go (like I heard ‘India, India, India’ a couple of years back) – China is trying damn hard to pull itself out of poverty.

With the restated numbers, more than 500 million Chinese1 live below the dollar-a-day poverty line.

Makes me want to give them a hand and outsource something to them.

Thanks to Angus @ Kids Prefer Cheese

1. 37% of China’s population. As a reference point, the entire US population is a hair over 300 million with 12% in poverty.

If We Can’t Share Nothing

Hi.

Howareyou?

Whatsup?

Nothing.

Yaknow.

PrettyCrappyOutToday

TrafficWasHorrible

How many times have you had that conversation?

Today?

Banality in less than 140 characters is the foundation of our social interaction. It safely builds the trust required for a longer, more engaging exchange.

If we can’t share the meaningless, what can we share?

“Part of the power of Twitter is that, among all of these social tools we use to communicate on the Web, this is the one that truly feels social…I truly feel as though I’ve established some sense of a relationship with certain people.” – Mike Keliher

Then Mike says nice things about me. Twitter-hug.

Leopard Installs Not Always Problem Free

After the customary until-you-can’t-take-it-anymore 30-minute waiting period, I installed 10.5 Leopard on the Mac mini that serves my local network.

The mini isn’t my primary work machine, and its regular duties are fairly straightforward: transport backups, be invisible.

As such, the upgrade was quick and straightforward (but I wasn’t really paying attention; I was working while all this was going on). After restarting, the mini was back. I poked around a little, started investigating CalendarServer, and decided that the spit-and-polish on the UI would in fact make me more productive1, so I started the upgrade on my primary workstation.

This was a bad idea.

After the initial upgrade, one of the handful of third-party PreferencePanes or Startup Items sent the Finder into an infinite loop of crash and relaunch and crash and relaunch, repeat until hard restart.

Two re-upgrades later, I determined it wasn’t bad installs, and booted into FireWire target disk mode to trash anything that might be interfering with the Startup process2.

While this process gave me Leopard, I’m still recovering days later. Not yet back to where I was in Tiger.

The archive+install ate my /local directory, so I needed to re-install svn and mysql (thankfully I left myself a reminder). The ruby mysql gem needed to be recompiled, and I’ve lost my VPN configuration. All of which were running just fine previously.

There are three more Macs in the house scheduled for upgrades, none of which I’m prepared to dedicate the same amount of time to as my primary machine. Yes, I’m gun-shy.

1. There’s a Jobs’ Reality Distortion Field for you.
2. M-Audio, Wacom, and Tivo are all suspects.

The Performance Power of MySQL Indexes

I’m at the point in the development cycle where performance is worth working on, for two reasons: everything else that’s going in is in, and it’s too slow.

After one too many out-of-memory errors and far too many times wondering if I should have purchased a larger server to begin with, I sent an SOS message to McClain.

I’ll paraphrase his response, “You don’t have any indexes in your database. Get some.”

I took McClain’s advice and drew up a migration to add indexes for any table with a *_id column in it, and a couple of heavily used non-_id columns:
add_index :publishings, %w(item_id feed_id)
remove_index :publishings, %w(item_id feed_id)
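
In migration form, that pair of calls is just the up and the down of a single migration. A minimal sketch of one of them (AddFeedIndexes is a made-up name; the real migration repeated this for every *_id pairing):

class AddFeedIndexes < ActiveRecord::Migration
  def self.up
    # composite index covering the join-table lookups
    add_index :publishings, %w(item_id feed_id)
  end

  def self.down
    remove_index :publishings, %w(item_id feed_id)
  end
end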

The results in my feed parsing engine are stunning.
Before indexes: 50-60 seconds / feed
After indexes: 10-12 seconds / feed
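
To double-check that a query is actually using a new index, a quick EXPLAIN from script/console will tell you. A sketch, assuming the publishings table and the composite index from the migration above:

# MySQL's EXPLAIN output includes a 'key' column naming the index it chose
ActiveRecord::Base.connection.select_all(
  "EXPLAIN SELECT * FROM publishings WHERE item_id = 1 AND feed_id = 1"
).each { |row| puts row.inspect }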

Local-Ownership Arguments Loco

“If the small operations want to stay in business… they have to innovate.” – Wolfgang Puck

I’m with Puck; I don’t buy the argument that big national operations take money from smaller locally-owned operations. If anything, big-name presence is proof a viable, stable market exists. Would it be better if Puck didn’t have a presence here and wasn’t pushing for locally grown food sources? No.

Should we all eat at Buffalo Wild Wings instead, simply because it’s headquartered here?

The premise that shopping at a national operation isn’t shopping local is bizarre for a number of reasons:

  1. The people working there live locally.
  2. Much of the business’s income needs to stay local just to pay for continued presence (rent, marketing, salaries, utilities).
  3. And in the case of 20.21, the food itself is regional if not local.

Now, flip the argument around – Minnesota is home to many national brands (Target, 3M) – are we going to stop giving them money because they have locations elsewhere?

No, that makes no sense. In the same way the trade deficit doesn’t make sense.