Really Why Django and Rails CMS Are So Rare

A little bit ago, this article – Why Django and Rails CMS Are So Rare – came across my radar. Being that I pay a lot of attention to content management systems, I was intrigued. After reading it, I immediately started formulating a comment, only to decide I had way too much to say on this topic. For those of you looking for a higher ed tie-in, there isn’t one. This is purely industry rebuttal here (with notes of ranting for good measure).

Let’s start things off by running straight through some of the commentary in the original post, because I think there are some issues there that need addressing. For instance:

Put another way, Rails or Django shops are much more likely to roll their own CMS than ever use a “boxed” CMS. Why is this? I’m convinced it’s because for many developers, a CMS is seen as little more than a bootstrapper.

This is partially true. I’ve said in the past that I think the best CMSs are the ones that provide amazing frameworks for creating sites, making as few assumptions about the presentational and relational components as possible. It’s one reason I still love working with dotCMS. But to say Rails and Django shops make their own CMSs because of this is completely disconnected. If that were true, you would expect devs of any language to roll their own as a preference. He defends this point by discussing native functionality:

In this sense, Rails and Django provide many of the core building blocks of a CMS, they’re just waiting for someone to bolt them together.  A Rail/Django project is a bit of a shapeless mass at first, but if you tell it to be a CMS, it would re-arrange itself pretty easily to turn itself into one.

You mean, like pretty much any software project? Every programming language provides different tooling, IDEs, capabilities, etc. to accomplish tasks. ANY software project can be described as “a shapeless mass at first, but if you tell it to be a CMS, it would re-arrange itself pretty easily to turn itself into one.” The idea that this is somehow unique to Rails/Django, and also a strength, is a complete fabrication. I hear this all the time whenever people assert Language X is better than Language Y because of Reason Z. You know what’s even more accurate? That when you commit yourself to a language/platform/system, and learn to use it right, it will do exactly what you want it to do, very efficiently, and in a way other platforms can’t. More on this in a moment though.

Shops who use Rails or Django are much [sic] likely to simply develop code libraries to provide the functionality necessary to a CMS and them [sic] drop those in on a per-project basis.  This code may never see the outside of their organization, and it certainly never gets marketed as an installable, usable thing.

Yep. You know where else this happens a lot? Java shops. And PHP shops. And .Net shops. But it hardly has anything to do with why you don’t see more CMSs in the wild for those languages. This just in: good dev shops develop code and pattern libraries which they reuse for internal purposes, story at eleven! I can’t think of one single, solitary reason Rails and Django shops would be “much more” likely to do this than other places, beyond the weird belief that they are just magically better programmers who all have good habits.

Developers were struggling with basic MVC tasks in PHP for almost a decade before Rails came along in 2003 and raised the bar for everyone.  Remember, Drupal and Joomla have been around since 2001 (Joomla was Mambo back then); WordPress since just before Rails came out in 2003…

…Do we owe the existence of Drupal to the crappiness of PHP, especially the early versions of PHP?   Same on the ASP.Net side –  if ASP.Net 1.1 was better, would we even have Ektron… At the end of the day, were CMS developed to paper over shortcomings in various programming environments?

I’m chaining some comments together there that I felt needed juxtaposing. Here, he’s basically asserting that these CMSs were needed to overcome the shortcomings of their respective languages (with the corollary that obviously Rails and Django don’t have these issues). Of course, he himself points out how long PHP had been out prior to Rails (and prior to a time when most websites used CMSs at all. Heck, in 2001 I was using a crazy Perl/CGI-based system called Coranto – based on Newspro. I’m amazed that it’s still around). The fact that PHP still absolutely dominates the landscape is a testament to just how well it’s been developed, not how “crappy” it is. The idea that CMSs are hiding “shortcomings” of the languages is an absurd line at best, and reinforces some weird notion that those environments apparently exist for the sole purpose of enabling the presentational layer of the web. I don’t even know what to make of that. After all, PHP, Java, .Net, Perl, Python… all of these exist to do far more than just be a conduit for a CMS.

The existence and competence of Rails and Django have prevented a serious, shared CMS ecosystem from developing around either Ruby or Python.  Since their respective frameworks provide so much functionality out-of-the-box, developers in those languages have never had to generate critical mass around a CMS.

So, good, capable CMS platforms don’t exist for these languages because the frameworks are already so perfect? I guess that makes sense if the only people ever using the systems are developers. But most folks have to work in environments that require substantially more flexibility and user-facing tooling. You know what, I’ve used CMSs created solely by developers. They’re terrible (Drupal, Typo3). Powerful, yes, but terrible for end users. As a general rule of thumb, developers sort of suck at UI design. Software of any kind exists to enable those who aren’t its creators. It doesn’t matter if it’s a CMS or a word processor. Anyone who works in the real world knows that only a small fraction of a percent of people out there really know how the computers they use work. The rest need software to hold their hand as much as possible. The language or framework behind it? Utterly irrelevant to them.

Conversely, has the advent of competent frameworks for all languages slowed new CMS development?  Think about it – what CMS has been released recently that has made a big splash on the industry?  Silverstripe has been the new hotness in the LAMP world for a couple of years, and I hear about Concrete5 a little.  My own favorite open-source newcomer – ProcessWire – is still way under the radar.  The fact is, new entrants are more and more rare.

No. The advent of competent frameworks hasn’t slowed new CMS development. Actually, it’s had the opposite effect. There are new CMSs dropping practically every day (and dying just as quickly); you just never hear about them unless you’re listening. The issue is one of market saturation. Ten years ago, hardly anyone used CMSs, so there was a huge amount of space for systems to come in and snatch up land. That’s no longer true. Do you think ProcessWire is going to be the new hotness that unseats WordPress? No, absolutely not. Not necessarily because it isn’t a better system; it just doesn’t have the muscle. It’s the same reason PHP isn’t going to yield the internet back end to RoR. You can’t make a splash when the pool’s already full of people. Anyone who’s ever done sales knows it’s far harder to get a client to switch products than to buy their first one, even when they hate the one they have.

Rails and Django have been fantastic frameworks right from the start.

Really? Twitter might have something to say about that. Remember prior to 2008, when Twitter was supposed to be the crown jewel of what you could do with RoR? And they couldn’t stay online. They couldn’t scale. And they ultimately abandoned it. I’m not saying RoR is bad, but beware of people who drink the Kool-Aid. And Django? How about the fact that every release seems to see memory consumption increase while performance decreases? Point being, perspective is everything. Every system has strengths and weaknesses, and no system is right for everything. Part of being good at software is understanding how to select the right tool for the job, and knowing when that right tool might not be in your tool chest.

The Real Reason

Okay, let’s wrap this up with the real reason you don’t see as many RoR or Django CMSs. The reason is grade-school simple. But first, this is OP’s reason:

In short, why are Rails and Django CMS so rare? Because those guys have never really needed one.

You know, every time I hear this kind of line about a programming language, any language, I cringe. It just sounds elitist, and it’s not even remotely a real reason. Do you want to know why? Here’s why.

Server Side Languages Market

A lot of people can build their own cars from scratch. A lot of people can build their own houses from scratch. A lot of people can grow their own food from scratch. A lot of people can make their own clothing from scratch. A lot of people are also smart enough to know that while fun, and novel, most of these things are very rarely worth all that effort unless you just really feel good about yourself doing it. And even then, it’s not really a big deal to build your own house if you know how, when it’s a normal house. But the Ruby guys aren’t building normal houses. They’re building those esoteric, partially underground houses made from tires and beer bottles and then bragging about how green they are. They can be entirely right about that, and have a cool house that is every bit as efficient and unique as they say. They’re still in a small subset of an already small community whose main limiting factor is that they’ve chosen to be weird. When the ColdFusion guys can look over at you and giggle… dude… that says something.

My day-to-day CMS is a Java-based platform. Java is one of the more battle-hardened of the programming languages we have at our disposal. And yet every day I lament the fact that it’s crazy hard to find good Java devs to help us out. But I know in the end the reason it’s so hard to find them is because the community is so small. It’s a kiddie pool compared to PHP. These guys don’t need to go out and make CMSs, because they make crazy good money just being awesome at what they do for the people they work for. Yet, we have it easy compared to RoR. But it has nothing to do with platform/developer quality, and everything to do with marketshare.

And that, ladies and gentlemen, is Why Django and Rails CMS Are So Rare.

Keeping Your Kitchen in Order

I know you don’t want to admit it, but we all know that you love Kitchen Nightmares as much as I do. I actually enjoy cooking shows of all sorts, and I like applying their lessons to my somewhat limited selection of Kansas delicacies, which generally amount to cuts of select beef in different shapes. Seriously, ask about my steak sushi sometime. Nevermind. My point is that while watching Kitchen Nightmares today, I had a thought about the web development trade and just how much it parallels the cooking world. I want to share these thoughts.

The People

Allow me to borrow some definitions from Wikipedia, the clear authority on all things culinary.

Head Chef

This person is in charge of all things related to the kitchen, which usually includes menu creation, management of kitchen staff, ordering and purchasing of inventory, and plating design. Chef de cuisine is the traditional French term from which the English word chef is derived. Head chef is often used to designate someone with the same duties as an executive chef, but there is usually someone in charge of them, possibly making the larger executive decisions such as direction of menu, final authority in staff management decisions, etc. This is often the case for chefs with several restaurants.

In many restaurants, like many web shops, this is where the story can begin and end – for better or worse. Sometimes you’re just stuck flipping the burgers on the grill by yourself. Good head and executive chefs are not only great cooks, they generally have a keen understanding of the business world as well, because at the end of the day, if the financials aren’t right then the restaurant won’t make money and people get fired. Likewise, a good web lead will understand the business they work in, and how it ties in to things like marketing, customer service, etc. If a person is just a really good cook, that doesn’t make them a chef, and if a person is good at writing HTML, that doesn’t necessarily make them the right person to lead a web office. In the end, we can also distinguish between head and executive chefs in the web world. It’s similar to comparing a CWO to a Director of Web Services. Most organizations lack that level of granularity though, and like with restaurants, the duties and responsibilities will be run from a single touch point that could be described as either or both simultaneously.

Sous Chef

The Sous-Chef de Cuisine (under-chef of the kitchen) is the second in command and direct assistant of the Chef. This person may be responsible for scheduling and substituting when the Chef is off-duty and will also fill in for or assist the Chef de Partie (line cook) when needed. This person is responsible for inventory, cleanliness of the kitchen, organization and constant training of all employees. The “Sous-Chef” is responsible for taking commands from the Chef and following through with them. The “Sous-Chef” is responsible for line checks and rotation of all product. Smaller operations may not have a sous-chef, while larger operations may have several.

When thinking sous chef, think art or creative directors. You can also put product and project managers in this category. These people should be fully capable of doing any of the tasks they oversee in a pinch, but in practice they’re somewhat more managerial in nature. They communicate and coordinate. They’re the ones who will go to the project meetings so the web version of the chefs de partie don’t have to.

Chef de partie

A chef de partie, also known as a “station chef” or “line cook”, is in charge of a particular area of production. In large kitchens, each station chef might have several cooks and/or assistants. In most kitchens, however, the station chef is the only worker in that department. Line cooks are often divided into a hierarchy of their own, starting with “first cook”, then “second cook”, and so on as needed.

Our chefs de partie are the role-specific people: our front end developers, our interaction designers, our graphic designers. Each has their station “on the line,” and best serves their team when they are working properly in concert with the others. Over time, they’ll usually pick up skills from other stations to augment their own, allowing them to move in and out of spots as necessary to help out. This gives them flexibility, a broader skillset, and an understanding of team building. It’s okay if they can’t do every station well, as long as they understand how each station gets them from prep to table.

Commis

A commis is a basic chef in larger kitchens who works under a chef de partie to learn the station’s responsibilities and operation. This may be a chef who has recently completed formal culinary training or is still undergoing training.

Interns. Nuff said.

What I think is the most important part of the striation of responsibilities in the kitchen is the principle behind earning your stripes. In a serious kitchen, you don’t just come in and become sous chef without the requisite experience. No, you start as the commis, and you work your way up. By the time you’re a sous chef, or ready to become the head chef in your kitchen or somewhere else, you know and understand the roles under you because you’ve been there. It helps you respect and understand those who work for you. Folks who have worked with me in the 24 Hour Plays have heard me give a very similar speech as it relates to theatre – that I find it very important that if you want to direct, you should spend some time as an actor, and a technician, etc. Be the ball.

Skills

Ask any ten chefs what skills are most important to being great at the craft, and you’re likely to get a broad spread of answers. But like with any trade, there are a few very atomic skills that are fairly consistently important. For instance…

Palate

A chef with a bad palate is like a web developer who uses FrontPage. Having a well trained palate is imperative to the process of selecting and procuring quality ingredients for meals. We aren’t buying beautiful rockfish or delicious cuts of beef, but we are selecting CSS frameworks and jQuery plugins. We have to have a good sense of color and design. We need to know whether WordPress or Drupal is the best platform for a new site. Our ability to “taste” the environment we’re developing in and pick the right components to combine into it can make or break a project. You know you’ve developed a good palate when you can look at a website and tell what CMS it uses. Likewise if you can make the CMS you use disappear entirely to the user. They know it’s there, but you know how to keep it all perfectly balanced. This also gives you the instinct to tell when something is going wrong – when ingredients have “soured,” or you’re using too much of something, or not enough of something else.

Mise en place/Hygiene

Mise en place is a French phrase meaning “everything in its place.” It’s used to refer to having all your ingredients and tools ready and where they belong. I’m grouping this with hygiene because good mise en place skills inherently reinforce kitchen cleanliness and maintenance as well (I’m not just talking about washing your hands after peeing). Web mise en place would be everything from making sure your workplace ergonomics are well planned, to making sure that you lay out your software and tools in a way that makes sense and encourages good development. Just because our “countertop” is digital doesn’t mean you shouldn’t think about how you have windows open and arranged, for instance. Web hygiene means coding clean – maintaining comments, keeping HTML semantic, and not leaving old code out to rot on the countertop, so to speak. A sloppily coded and presented website can be a turn-off to visitors. Good web hygiene is a trait of good attention to core user experience on your site.

Knife Skills

This is a pretty direct metaphor. The chef’s knife is an extension of their hand, arm, and body – as is our Wacom stylus, magic touchpad, or mouse. It’s all about core skill competency and practice. A good chef can dice an onion blindfolded in a tenth of the time you or I would take. You should be able to spin up a Git repo, fork, and commit just as effortlessly. Mysqldump and LESS are your bitch. You can recite CSS selectors and attributes like a multiplication table. The knife needs to be sharp, and the knife is an extension of you. Those skills make your job easier, and the competency and effort behind them show through to visitors. Being adept at your skills won’t always change the “taste” of your end product, but the craftsmanship will show through, and it’s something visitors will respect and peers will admire.

Improvisation

When all a chef’s skills meet in the center, it makes for a magical sixth sense. Put a good chef in a room with some ingredients and give them a couple hours, and more than likely they will magically appear at the end with a beautiful dish. That’s the entire practical value in so many of the cooking competition shows – can the cook work under pressure with unknown variables? Knowing how flavors mix, understanding cooking principles, and having solid plating technique are all integral to producing fantastic dishes in a pinch, even if they don’t necessarily know what they’re walking into. Because no web developer has ever had to build something on the fly at the last minute, right? This is also how we grow as designers and developers. Anyone can follow a recipe – they’re just a Google away. But it’s what you can do with that recipe to make the end product your own that will really set you apart. If web development were all about following recipes, we would have been out of a job long ago, because every solution would be a simple cut-and-paste job a monkey could do. There are times when you can even get away with that, and certainly we all have. But those are times when you’ll never stand apart, and never produce anything unique.

Oh, and then there’s…

…that important fact about the way work actually gets done. If you get nothing else from my rambling above, take this away with you. You know one of the common, recurring themes on Kitchen Nightmares regarding why restaurants are failing? Menus and procedures that are forced upon the chef without regard or respect for the chef’s role and abilities. People playing in kitchens who don’t respect ingredients. The head chef is rarely the top person in the restaurant – someone else probably owns it – but successful restaurants know how to put the chef in charge of the menu and allow them to run the kitchen their way, coordinating the production and distribution of food, taking orders from the wait staff, and cycling out the finished product. A president, dean, or director of marketing should be able to trust their head web person implicitly – and it’s that inability to trust that I see repeated over and over as a core complaint from our peers at other institutions.

It’s that faith and commitment to excellence that makes the difference between a truly successful kitchen and restaurant, a perfectly mediocre one, and one that ends up with its doors closed.


Photo Credit: Some rights reserved (CC BY-SA) by Edsel L

OTC Goes Bold With Redesign

I want to extend a sincere congratulations to the folks at Ozarks Technical Community College on their redesign. It’s probably one of the bravest things I’ve seen a college do with its homepage in quite some time, for better or worse. And that’s good, because that’s how everyone learns. Someone has to take a chance once in a while. What especially caught my attention, though, was that they basically did something I never really thought would be possible. Back in 2009, I wrote a bit on the principles of IA in large sites like a university’s. Several conversations ultimately spun off from that article, one of which involved the idea of driving a university site’s navigation entirely through search.

Back then, it was little more than a pipe dream though. Random musings about a “what if” scenario. There’s so much to consider for it – and I’m not even talking about things like the political side of university sites – that as neat as the idea seemed, I never thought it could be done. And while I applaud OTC’s attempt, I still think the approach is not really ready – though it could be with just a little more work. Here’s why.

Majors Search Results

Probably the most important thing is SEO. If you are going to lean so heavily on search, your site – all of it – needs pristine SEO so that everything can be found and located properly. We’re talking metadata, keyword density, link text, the whole shebang. OTC is using a Google appliance of some kind, which can afford you a lot of power (sadly, Google discontinued the Mini this year, leaving only the more expensive GSA on the product line). You can see some of that power in action if you do a search for “programs.” Note that at the top you get the keymatch they manually entered to make sure a search for “programs” always returns the right page first. That’s good. Now do a search for “majors.” No keymatch this time referring the visitor to the programs page. The top matches aren’t relevant at all, as a matter of fact. That’s not to say the results are consistently bad, but in this approach, there’s just so little room for error.

PSU’s unified search

Another pain point for me here is the use of the stock results page. It’s bland, uninteresting, and doesn’t invite the user to explore the results. They have added additional search options above the box, but they aren’t integrated at all – each is a different landing page that isn’t necessarily search related. Lastly, they don’t seem to be taking advantage of collections, which can make a GSA or Mini so powerful in getting users into the right “bucket” of information. Collections are a way of filtering content into logical categories of some kind. For instance, you could have a “News” collection that keeps all the press releases searchable and separate from the normal search. PSU’s search is an example of both unified search and collections (seen to the right). Things like “athletics” and “classes” are collections, while “people directory” is actually a separate system. But it all works through the single interface (though the people results do go to a different results page, so it’s not entirely unified).
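
For what it’s worth, collections on a GSA are exposed through the search protocol’s site parameter, so pointing a query at one is just a matter of the query string. A minimal sketch – the hostname, frontend, and collection name here are hypothetical:

    # Restrict a query to the "News" collection on a GSA
    # (hostname, frontend, and collection name are hypothetical)
    http://search.example.edu/search?q=commencement&site=News&client=default_frontend&output=xml_no_dtd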

Something else, still specific to the GSA: they don’t appear to be using OneBox modules either. That’s the perfect way, for instance, to try and pull in some of those external searches from the results page header, like departmental and contact searching. Try it: do a search for “NUR 230,” a nursing program course. Using a OneBox module, they could instantly provide course information, schedules, associated books, teachers, etc. If you want more examples, the OneBox is what gives you instant results in Google when you do things like typing in a FedEx tracking number, looking for movie times, checking the weather, and so on. That’s the trick here. If you’re going to go all in, blow it out of the water. Universities have TONS of structured data that could be presented this way, to fantastic effect. Won’t someone think of the user’s clicky finger?

From an article at the Community College Times:

OTC Chancellor Hal Higdon said a review of the college’s website using Google Analytics showed that more than 80 percent of site visitors find what they look for through an outside search tool or OTC’s Google search server. Often, visitors skip the front page and go directly to the search box to quickly find the information they need.

Google Analytics Site Search Usage report

Admittedly, I know nothing about just what went into this research (and if anyone at OTC reads this, I live in Pittsburg, KS, about an hour and a half from you – let’s talk), but I would caution any school interested in this that analytics alone will absolutely not give you the full picture here. It can give you a lot of information, to be sure, but context and intent are intimately important to this particular endeavor. For instance, it’s easy to say that people may search a lot on a site because the navigation or IA sucks – something analytics alone won’t tell you. So it would seem reasonable that going all search would avoid that problem, since search is designed to do an end-around on such things (this is, of course, assuming you aren’t considering things like nav and IA in your search logic). But maybe they search simply because your content sucks, and they’re trying to find something more informative. That’s a content problem. My point is, know your problems and know your goals. Have a plan for each, isolate your success metrics, and have a maintenance and measurement scheme ready.

And there certainly may be something to catering to users who search. A quick look deeper into the Google Analytics report sampled above (you do look at your search reports, right?) revealed some extremely interesting metrics. For instance, the average user spent 4:33 on the site, as opposed to 11:48 for users who searched. Users who didn’t search viewed 2.73 pages on average, compared to 8.28 for searching users. What the analytics here don’t tell me is why. But hopefully, if your numbers are similar, you would want to know the answer to see if there’s something valuable there to be leveraged.

There’s something else that bugs me, though. While I don’t want to nitpick, I feel the need to point some of this out.

“Start Here” navigation

In trying to mimic Google, they also used a “services header” on the homepage. That’s fine, go for it. But I gotta admit the logo really bugs me. It just looks stuck on and clip-arty. But more than that, I am really bugged by the “Start Here” link. First off, “Start Here” isn’t at all descriptive of what to expect when I click on it. And once I did, I was confused to find myself looking at a page with a careers-based URI whose content seemed to be related to academic programs. That’s just a labeling thing, but it’s a pretty major one, since it’s first chair in what little navigation they have. They also added a “more” link. While I know this is in line with mimicking Google, it smells too much like rebranded quick links. As a user, if the goal is to have me search, why would I click the “more” link rather than just type in the keyword for what I want? From the very start, you’re already inviting me to break with your intended navigation scheme, and that’s a dangerous game.

At the end of the day, I still think there’s something to this. Every university struggles desperately with IA and navigation. Awesome, global search just seems natural. The barriers that will most commonly prevent success are technology that can’t deliver, and the politics of university web maintenance. If you’re considering it, keep this stuff in mind:

  • Hire a full time SEO person. Period. Don’t be cheap here.
  • Don’t abandon navigation altogether. Consider your “services” that require fast access. This requires a shift in thinking, making your homepage that of a “service provider,” rather than whatever you are now.
  • Spend six months on taxonomy. Card sorting. User research. Whatever people call something, make sure those keywords are mapped and accounted for.
  • Make use of autocomplete and dynamic results (again, both things Google does). Save your users as much time as you can, and help eliminate mistakes.
  • Utilize tools like OneBox or similar systems to provide enhanced result data for commonly accessed, structured data.
  • Make sure you have a reporting system on pages. A “Was this what you were looking for?” flag people can click that will report the page and the search that sent them there (see the sketch after this list).
  • Accept the fact that you may have to take away a lot of editing rights from people to prevent pollution of your results. Two words – Quality. Control.
  • You might consider splitting the site into a sort of “gated” and “ungated” area, where the gated area is vetted, approved, specific info. The ungated section is everything else that no student ever cares about.
  • Respect the results page and how important it is.
  • Unify your search platforms.
  • Measure and track everything. Can you tell me the most viewed, but unclicked autocomplete keywords? Most common misspellings? Keywords most likely to result in an application? Bounce rate after a search result? And these are just some of the easy ones.
  • Your search needs to be smarter than your users. It should know what they want, regardless of how they ask for it. It needs to deliver, accurately, without question. It needs to adapt incessantly.
  • Hire a full time SEO person. Period. Don’t be cheap here.
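
About that reporting flag: if you’re already running Google Analytics, even a dead-simple “this wasn’t it” link wired to an event gets you a running report of failed searches. A rough sketch with placeholder names – the label would be filled server-side with the visitor’s actual query:

    <!-- On the search results page. Category/action names are placeholders;
         the label would be populated server-side with the actual query. -->
    <a href="#"
       onclick="_gaq.push(['_trackEvent', 'Search Feedback', 'miss', 'billybob big adventure']); return false;">
      Not what you were looking for?
    </a>
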
Edinboro’s keyword autocomplete

Oh, there’s one more important thing here. I don’t care if your homepage is a Google knockoff or not, you should care about search. Almost all of my bullet points above hold true no matter what your web strategy entails. Edinboro University is one I credit with putting a ton of work into mapping keywords for things on their site to an autocomplete feature for their search. Their keyword system is a completely secondary system, too – it’s not in a GSA or anything like that – but they unified it properly so the user’s experience is seamless. All users know is that they’re getting good recommendations that can save them keystrokes. Otherwise, Edinboro’s search is implemented just like any normal search, nothing else special about it. But the details, the little things – that’s what can matter the most.

Good search is like a life preserver. It can save a visit. It doesn’t matter if it’s just a tool or your entire navigation. Bad search frustrates users and drives them away, and I don’t know anyone who’s in that business. At the end of the day, I have no doubt OTC will continue to improve, and I have a ton of respect for the effort this community college has put forth here. I’m damn interested to see how it evolves.

Are You Being Used?

Have you heard of Fiverr yet? Fiverr is a service that launched back in February of 2010 as a tool for people to sell simple goods and services for five bucks. Maybe that’s planting a tree in your honor in the rain forest, or sending a letter to a random soldier, or belching your name on video. Pretty much anything goes. It’s not a terrible idea, strictly speaking, and is a nice way for people to make a little extra money doing something they’re good at.

So, what does this have to do with higher ed, and why should you care? Well, simply, this.

Search Results on Fiverr for “edu”

It’s no secret that there are plenty of black hat SEO techniques for link farming. This is also far from the first time someone has tried to leverage the .edu TLD for link relevancy (Note: it seems pi.edu has finally gone away, without much fanfare. No one misses it.). On top of that, odds are you can’t make Fiverr stop these listings. Because screw you, that’s why. At least, I suspect that’d be the subtext of the answer you’d get from them.

How Does It Work?

Simple: spider services have created lists of things like blogs and wikis that have unmoderated change or comment systems. The people offering these services buy or pirate those lists. In some cases, they have tools that automatically submit to sites on the list. Then you watch the spam start coming in. Anyone who runs a WordPress site understands how much trouble spam can be. If you’ve ever wondered where it comes from and why, this is a pretty good start. In the end, the provider or their software tries to pass as a legitimate commenter and includes a link in the post text or in the author site field (if you display the author’s link on their name). The link shows up, they get paid, and you get polluted.

This is a much less offensive and less dangerous version of the account hijacking we’ve seen in the past, where faculty, staff, or student web space hosted by the university is taken over and used to host landing pages or to drive backlinks and keywords.

What Can You Do?

Shut. Down. Everything. Okay, not really. But seriously, do review your moderation and approval processes for your blogs and wikis. Anything someone can contribute to should be reviewed to make sure you haven’t created a target. Keep some of these in mind (adapt to your environment):

  1. Don’t ignore your sites and security settings.
  2. Try simple steps like requiring at least a first post to be approved before users are whitelisted.
  3. Look at third party commenting services like Disqus or Intense Debate which have tools for addressing this that are better than yours.
  4. Many CMSs have plugins that can provide more robust comment protection. For instance, Akismet is common for WordPress. I’ve had success with Spam Free WordPress.
  5. Add moderation or extra steps to comments containing links.
  6. Make sure links in comments are set to come through with rel=”nofollow” enabled (see the example after this list).
  7. Limit faculty and student abilities when it comes to setting up and configuring sites, blogs, wikis, etc.
  8. Allow visitors to vote down or mark comments as spam.
  9. Turn off commenting after a certain length of time or when a blog is discontinued but still available.
  10. Set up a routine to audit your sites for this kind of spam every X months.
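
On item six, the markup itself is nothing fancy; the point is that the comment link passes no ranking credit. The rendered output would look something like this (URL hypothetical):

    <!-- A comment author link, neutered: rel="nofollow" tells search
         engines not to pass ranking credit to the destination. -->
    <a href="http://example.com/spammy-page" rel="nofollow">totally legitimate site</a>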

None of these suggestions will likely work on their own. Some may or may not work at all in some cases. There’s no real silver bullet to the problem, as long as humans are willing to do the work manually for companies for $5.00. But you can at least try to minimize your risk of exposure by making the effort cost the spammers more than it’s worth. When they get through anyway, if you’re monitoring properly you should be able to delete the comment and blacklist the user or IP quickly enough that it becomes apparent you aren’t a high value target. The bottom line is to be vigilant, be active, and take responsibility for the sites and services you’re offering that could be targets for these types of tools. Fiverr is far from the only way to accomplish this (see?), but what really matters is preventing the end result.


Photo Credit: Some rights reserved (CC BY) by 666isMONEY ☮ ♥ & ☠

Rethinking the UX of the Program Listing

Take a moment and think about your listing of majors and minors. Really think about it. Is it good? Does it reflect how great your offerings are? Is it even accurate? Is it just a stupid, boring, damned list (if you’re interested in something a bit off the beaten path, check out RIT’s Pathfinder system or look at what the University of Arizona is doing)? If the answer is yes, I want to kick you an idea. Filtrify.

On its face, Filtrify is just another jQuery plugin that you can use for atomic control of a collection of DOM elements. Which is cool enough I suppose. But check out this example on their demo site. Now, instead of movies, imagine it’s student action photos from different programs, or some other visual representation of the program. Instead of genres and actors and directors as filters, you have schools and interests and jobs. It would leave you with an interactive program listing that invites a user in to play and explore. In this particular case, Filtrify is serving as an extension of the live filter design pattern – enabling a user to see all the available options, and then selectively removing that which isn’t relevant to them. People like toys, and they are inherently curious. Create an environment that promises an opportunity for exploration, and you’ll net some explorers.

But wait, it doesn’t have to be Filtrify per se, either – that’s just one idea. Something like filtering blocks would work just as well. As would something you come up with entirely on your own. The trick is, you need to start rethinking the UX of the program listing (and probably a lot of other stuff on your sites, too), and really consider how your tools may be impacting prospective students’ ability to see you as the right institution for them. Jakob Nielsen pointed out how bad lists could be nearly a decade ago (see #7), yet schools seem to be married to them for lack of the desire to construct a better way. People don’t find long, unfilterable lists to be user-friendly at all. We already know that 17% of students will drop a school from their list if they can’t find what they want on your site. Even more will mark a school down if they have a bad experience. What is that risk worth?

The underlying issue here is that schools need to start putting more effort into the next step of their web design processes, and start looking at the user experience of what they are making. It’s easy and fast to slap stuff together and move on, but there is enormous value in usability testing. It’s a part of the overall process that is too frequently skipped, since a published webpage is often seen as “good enough.” While the old fashioned linked list may be functionally adequate for the data being displayed, it’s a terrible way to encourage interaction and leave a good impression on your visitor.

Even if you didn’t want to use a library like Filtrify, you can still come at the problem of filtering content in a user-friendly way by falling back on some basic principles like LATCH (location, alphabet, time, category, hierarchy). LATCH is a content filtering methodology that most users are, consciously or not, readily able to adapt to. That makes it a great place to start when trying to solve the problem of helping people find what they need in any large archive of structured information.

So how could we apply LATCH to a set of link filters for our program listings? Here’s one example (and there are plenty of others):

  • Location: This could be a physical campus location, online programs, or a more meta concept like a college or school.
  • Alphabetical: This pretty much goes without saying. But keep in mind your taxonomy might not be the same as your visitors’. Don’t be afraid to overload topics and point them to the same overall detail page.
  • Time: This one can be harder, but could be length of the overall program, number of credit hours, or number of total semesters.
  • Category: Think generalized subject or job areas here. For instance, “teaching” will likely return a number of different specializations.
  • Hierarchy: You could use this to break down by schools and departments, or requirements, or to set up graduate tracks.

The insane part about all this is that in many cases it would only take a little work to make fairly significant usability improvements over the current lists of programs. Something as basic as a live search filter would give users at least a little empowerment over the current model at many schools. Empowered users will be engaged users. And it’s much easier to get an engaged user to fill out an application. On the other hand, if the technology you’re employing on your website doesn’t give them faith that you’re modern and student-centric, they’ll move on.
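
To make that concrete, here’s a minimal sketch of the live filter pattern in plain jQuery – not Filtrify itself, just the bare-bones version, with hypothetical IDs and markup:

    <input type="text" id="program-filter" placeholder="Type to filter programs...">
    <ul id="program-list">
      <li>Accounting</li>
      <li>Biology</li>
      <li>Nursing</li>
    </ul>

    <script>
    // As the visitor types, hide any program that doesn't match the query.
    jQuery(function ($) {
      $('#program-filter').on('keyup', function () {
        var query = $(this).val().toLowerCase();
        $('#program-list li').each(function () {
          // Show the item only if its text contains the typed query.
          $(this).toggle($(this).text().toLowerCase().indexOf(query) !== -1);
        });
      });
    });
    </script>

A dozen lines, and the wall of links becomes something a visitor can actually work with.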

Majors, minors, and programs are just one of many examples that could benefit from a little of this kind of TLC. I mention them as the focus of this post mainly because they tend to sit really high in the funnel. But how about:

  • Student organizations
  • Offices and departments
  • Faculty listings
  • Events
  • Courses

How many things could you improve with just a few hours’ work, and a little focus on the overall UX of the content you’re trying to present? Which do you think your visitors would get better use out of? Are you particularly proud of your program listing page? Share it in the comments below for others to see. And if anyone actually does build a site based on Filtrify, let me know – I’d love to see how it turns out!

Why Higher Ed Sucks at Content Strategy

Let’s face it, higher ed has problems. A lot of problems. Whether it’s bad coding, poor graphic design, or a lack of upkeep, someone is always talking about something that’s not working and getting plenty of sympathy from the rest of the web development community. Article after article, conference after conference, we talk about all the different things we have trouble with and try to understand why they don’t work and what to do about them. One area that’s been getting more and more focus, in part thanks to folks like Meet Content, is content strategy (regardless of whether or not you think it’s a “Real Thing.” I’m looking at you, Karlyn). With the start of a new year, many of us are taking some time to revisit our policies and practices, and get ready for a better 2012. But there’s one big problem when it comes to content strategy for us:

We’re gonna fail.

The thing is, there are ultimately so many factors working against us that it’s extremely difficult to find success in any kind of realistic content cycle. There are a handful of folks doing okay, and parts of sites that venture off on their own have also managed to find success, like the Financial Aid department at Ithaca College. The tough part is that despite all of the case studies and conference presentations, schools find they cannot replicate the success demonstrated by someone else. But as I’ll discuss, there’s a good reason the ones that are doing well are managing it, and it requires tough decisions.

We’re too damn big

I’ve talked to more than one DI-level school that has, and I kid you not, millions of web pages. Millions. Millions. Think about that for a second. If you checked 100 pages a day, every day for a year, you wouldn’t even manage to check the quality of 50,000 pages. If you had only one million pages, that wouldn’t even cover 5% of your site. One of the first steps in starting a content strategy is a content audit. How much of your site are you prepared to commit to that when you’re so huge? Yes, a lot of that is automatically generated or archival. Yes, not all of it is meant for normal human consumption. Yet the fact remains that when a problem is so big you can’t even pinpoint where to start, many will choose to do nothing. Since many university sites lack any comprehensive business or marketing strategy when it comes to the creation and maintenance of content, literally every piece of information gets put out there, and it’s put out there by hordes of individuals who are ultimately not qualified to edit websites. So we grow. And grow. And grow. Then there comes a point where you see folders that literally have ten versions of the same page, and you’re faced with the challenge of figuring out which one is “right.”

Only You Can Prevent Gray Goo

Remember my mantra. Repeat it to yourself in your sleep. Tattoo it on your forehead. Wax it into your chest hair. Do less better. Stop pretending that some day you’ll come around and find a way to control this problem. You won’t. Your users will keep producing content that will eat your site alive over time, at a rate that will outpace your ability to police it, until it’s impossible to find the pieces that are of value to your visitors. Think of it like the signal-to-noise ratio of your site. There is a DEFINITE line that you must mind. One of the best ways to know that you’re getting too close to it is when you get this phone call:

“Yeah, hey, we were wondering… when you do a search on the site for Billybob’s Big Adventure, our page is like the 1,337th one that shows up. It really should be first, but instead right now you see Billy Bob’s Big Adventure from 2010. See, that’s old and we changed it to Billybob for The Twitterz. Students are complaining and we have an ad going out in 7 minutes, 26 seconds to promote it. Can you fix that and make it show up first?”

Maintaining good content is an expensive process in time, labor, and money. Not maintaining a bunch of crappy content is sort of like running up a balance on your credit card. When the bill comes due, the interest will eat you alive.

Employee turnover

If you aren’t already, imagine you’re an army of one. You leave. What happens? How many keys do you hold? How well documented are your processes? You’re the motor, the driving force behind all the important web strategies. Do you think you’ll be replaced by someone just as motivated? Just as skilled? Just as willing to work until midnight without logging comp time? Will you be replaced at all? When we experience turnover in our offices, that’s bad enough (though I do believe the applicant pool for our positions is getting better with age). If you are one of the keystones of your web office, how many months of productive web time are lost when you leave? That’s a tough blow to come back from, and on its own can have high costs for your overall site quality.

…knows a thing or two about being an army of one…

What if your boss or VP left and was replaced with someone who had a different vision of strategy for the web? What if that person decided to gut years’ worth of hard work and cycle building (because they don’t trust their tools – see below)? How would that impact your ability to maintain the site?

An even bigger challenge: if you have a hundred or more people across campus contributing to the site, how quickly are they getting recycled? Are you even told when these folks leave? Do you keep track of the attrition rate? 10%? 15%? More? And these aren’t usually people who know the web, love it, and breathe it like you and I do. They’re the ones calling with questions about putting an image on the right side of a page. Nevermind their writing skills. With constant turnover, and typically mediocre training programs in place, you never get to train a solid foundation of thoughtful, understanding web contributors. In cases where you do, you stay awake at night worrying about private sector competition for those people. This is also the nightmarish trick that will turn an apparent short-term success into a long-term failure.

The idea of a sort of “critical mass” in your editor pool, where they become somewhat self-sufficient and able to help each other and stay productive, is a myth for most universities.

Wrong chain of command

I was at my last university for going on six years. In that time, I had three bosses (four if you count the few months I answered directly to the VP until my next boss was hired), four offices, and was part of three different organizations: OIS (our version of IT), Marketing, and finally Marketing and Communication (an evolved and restructured version of the second). This is an incredibly common story. When you can’t stay in one place for more than a couple years, it’s nearly impossible to get solid processes and cycles in place – they always end up disrupted and thrown into disarray by the changes.

Where do you keep yours at?

Ultimately, none of these kinds of offices – IT, marketing, development, PR, etc. – are the right place for us. It’s a responsibility shell game. Web communications is a system and discipline unto itself now, and it needs to be recognized, authorized, and resourced as such. Anything else is hiding it in a silo, where its efforts and priorities are colored by the strategy of whoever is in charge. Moving it around doesn’t solve that issue, it only changes the flavor.

Hint: if it tastes purple, see a doctor.

We’re too established

Higher ed is changing. Slowly, but surely. Many times, it’s a tortoise and hare race, and more than once the slow pace of higher ed has been a buffer working to my benefit. But the cycle of change isn’t coordinated enough. Our foundations are old, but solid. There are cracks, but it’s not compromised yet. Look at the pyramids. They show their age; they’re a little worse for wear after the weather, the wars, and the abuse. They also aren’t going anywhere. This is the source of much infighting in higher ed. I am not a fan of decentralized web management. I feel it breeds resentment and accomplishes little. People use decentralization as a “solution” to the “We’re too damn big” problem without consideration for how it actually functions. It’s a mismatch in the problem-solution process.

The thing is, we’re too “established” in the politics of how we got here. One of the main reasons we let everyone have a site and do their own editing isn’t because it’s good for the users, or good for the content, but because we don’t want the headache or the bad press for trying to take the capability away. What the hell kind of screwy strategy is that? It’s just yet another shell game – this one of responsibilities. Creating any kind of good content strategy is going to require changing the way people work on your website, and that is diametrically opposed to the long-standing tradition of “this is my site, I’ll edit it how I like.” Culture, by the very definition of the word, is a hard thing to change.

Sometimes you just gotta rip the band-aid off.

Looking too much at startup success

I try to read a lot. Sometimes I’m a bit more successful at that than other times. One of the huge constants I see, though, is that a lot of the success stories we look to outside higher ed come from tech and startup firms. MailChimp is a great example. Their Voice and Tone site is a thing to behold. And you can’t have that. Not yours. Tech firms get it. They understand the role the web plays in their business strategy, and they address it properly as a result. Startups (tech or otherwise) have the advantage of building their processes correctly from the ground up. Instead, we’ve bolted it all on, like that guy in town driving the 1989 Buick Reatta painted in gold fleck with a plywood spoiler (I REALLY wish I had a picture of that right now to share). We can’t use those examples because we aren’t them. And as I previously stated, we often can’t even look within our own industry, because schools are too unique – we can’t just replicate others’ success by rinsing and repeating.

Part of finding success is making sure your solutions fit your problems. We share many commonalities from school to school, but every problem we face requires some introspection and tailoring. It’s okay to get input from colleagues elsewhere to make sure you’re on the right track, but make sure you’re working towards your own solutions.

Focused on finish lines, not cycles

Pretty simple here. We have to get the people we work with or through to understand that maintaining a website is not something that is ever complete. It’s a cycle. You’re always doing it, and it’s not something you can ultimately step back from and wash your hands of.

That’s what she said?

We don’t trust our tools

One of the biggest and most common complaints I hear from web folks at other schools is the lack of internal validation they get on campus. They offer an opinion, are ignored, and end up having to defer to the HiPPO. For some reason that still defies much logic, we hire experts (or at least people who could be trusted to give the advice), but administration gives their feedback barely more weight than a grain of salt. Slowly, I see this changing, but it’s still part of the “We’re too established” principle that will be around for a while. The web is built around challenging old world concepts, so answers to questions usually involve risk and speed (not the drug. Hopefully), and that’s uncomfortable for higher ed administration.

So what do we do? We pay consultants to tell us what we already know, which slows things down a little. Something we already knew gets drawn into a six-month ordeal. And when it’s all said and done, we still don’t empower our people or validate that they were right all along.

This is why our “recipes for success” often come out looking burnt and tasting like purple.

So, What Do We Do?

I hate griping for so long without offering some kind of solution, because that’s not very productive (though I know you just sat in your office for 30 minutes reading this, so don’t gripe to me about productivity. Also, I’m sorry I’m such a long-winded jerk). You can boil this down to some pretty simple takeaways.

  1. Wake up – stop running the rat race. Acknowledge the fact that more than likely, the way you’re doing things isn’t really a plan for long term success. You need to be clear headed and have a strong vision if you’re going to…
  2. Get high level buy-in – your boss, your boss’s boss, and your school president. Sit down with them one-on-one and sell your process to them. Change has to come from within, but it won’t come at all if you don’t have some big iron behind you. This will also help you as you build towards acquiring proper authority and chain-of-command.
  3. Prepare for pain – if you’re going to make real headway and do some actual good, you’re going to need to piss some people off. In the words of Colin Powell: “Being responsible sometimes means pissing people off.”
  4. Identify the right problems – one of the biggest mistakes we make is not really understanding the root of our troubles, which then leads to…
  5. Identify the right solutions – decentralization is not a solution. Make sure you have properly matched a solution to the problem you need to solve. And make sure it fits your organization and needs. Understand that all the articles and workshops in the world won’t prepare you completely for what you’ll need to do at your school to get things on the right track.
  6. Set the right goals – because this is how you’ll validate all the pain and build a new foundation using the right models.
  7. Do epic shit – seriously. Break the establishment. Smash that egg and make a delicious, digital omelet.

I apologize if you expected a bit more than some simple platitudes regarding how to get your web content on the right track. I can’t offer more than that, because the real solution is just doing a lot of hard work. And I don’t care if you call it content strategy, or marketing strategy, or web marketing strategy, or whatever. There are a million right ways to do this, and only a few wrong. The key is to work your butt off towards the goals you set, and you can’t go wrong. We’re all in the same game and playing for the same team. The difference is how you come at steps 4, 5, and 6 above. Focus on what will work for you and make your plans successful.


Photo credit: CC by-sa, some rights reserved by QuinnDombrowski

Idiot’s Guide to Event Tracking

On July 23rd, 2011, I had the pleasure of presenting at the HighEdWeb Arkansas regional conference. My topic looked at approaching our websites through incremental realignments rather than sweeping redesigns, based on information we can get from our analytics. A core component of that topic is using Google Analytics for event tracking on parts of your site. After talking with some folks post-presentation, it was clear a deeper look at event tracking was needed to show why it’s important and how to use it.

So, what is event tracking? Let’s let Google speak for themselves on the matter first:

Event Tracking is a method available in the ga.js tracking code that you can use to record user interaction with website elements, such as a Flash-driven menu system. This is accomplished by attaching the method call to the particular UI element you want to track. When used this way, all user activity on such elements is calculated and displayed as Events in the Analytics reporting interface. Additionally, pageview calculations are unaffected by user activity tracked using the Event Tracking method. Finally, Event Tracking employs an object-oriented model that you can use to collect and classify different types of interaction with your web page objects.

Crystal clear, right? In layman’s terms, while most normal reporting in GA lets you look at data at the page level, event tracking lets you track things, components, and happenings on pages. This is extremely important when it comes to measuring the performance of tools on your site that don’t necessarily link to a new page, or that perhaps link to pages you can’t track. For instance, want to track how many people click the play button on a video? How about tracking how many people open the net cost calculator you’ve put into a modal window? Or maybe you have links repeated on a page and you want to test them against each other (GA will treat these as one link in overlay mode if they have the same href). All of this is possible through the use of events.

Understanding the Function

First, let’s look at how you put event tracking to use. Basically, you’re going to tie a GA method to anything you want to track clicks for – the simplest way is by using the onclick attribute. The method in question is _trackEvent(). It is possible to call this on page load for global events, but if you’re trying to track actions on the page, you’ll combine it with the _gaq.push() method (used for asynchronous tracking) in the following manner:
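Something like this minimal example (the link, category, and label here are placeholders of my own, just for illustration):

    <a href="/admissions/visit.html"
       onclick="_gaq.push(['_trackEvent', 'Admissions', 'Click', 'Visit Campus']);">
      Schedule a campus visit
    </a>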

That basically says: “When someone clicks this link, we want to push _trackEvent data to GA, and here’s the data.” The three fields help you segment and silo the data.

  • category – For Event Tracking, a category is a name that you supply as a way to group objects that you want to track.
  • action – Use the action parameter to name the type of event or interaction you want to track for a particular web object. (e.g. ‘click,’ ‘download,’ ‘play,’ etc)
  • label – With labels you can provide additional information for events that you want to track. (e.g. video title or the name of a file)

So, knowing that, you could do the following in a situation where you needed to track form interactions:
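A sketch of how that might look (the form name and URLs here are hypothetical):

    <!-- Same category and label, different actions -->
    <a href="javascript:window.print();"
       onclick="_gaq.push(['_trackEvent', 'Forms', 'Print', 'Transfer Application']);">
      Print this form
    </a>

    <a href="/forms/transfer-application.pdf"
       onclick="_gaq.push(['_trackEvent', 'Forms', 'Download', 'Transfer Application']);">
      Download as PDF
    </a>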

In this case, by changing up the action, I can track the same basic information as it’s used in different manners – here, folks who would prefer to print the document versus those who download the PDF version.

Other Uses

Trying to think of ways to use event tracking? Consider these.

Outbound Links

This can be especially useful if you currently don’t have a way to track people once they enter your university application funnel, which might live on another server. By tossing event tracking on the link, you can at least see how many people are clicking to start an application, and compare that to the number of completed online apps to get your dropoff rate for the overall funnel. Likewise, you can track clicks into any of your third-party software platforms to get some idea of how frequently they are used.
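As a sketch, the application link might look something like this (the URL is made up):

    <a href="https://apply.example.edu/"
       onclick="_gaq.push(['_trackEvent', 'Outbound', 'Click', 'Online Application']);">
      Start your application
    </a>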

PDF Downloads

This is a common question: “How do I track X, when X isn’t a webpage I can put analytics code on?” Normally, it’s asked in the case of PDFs, Word documents, and other types of downloads. GA is good at tracking web pages, but sometimes you want to track non-HTML content. Events aren’t the only way to do it, but they’re a useful and simple one.
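The simplest version is the same onclick approach from earlier (the file name and label here are placeholders):

    <a href="/docs/viewbook.pdf"
       onclick="_gaq.push(['_trackEvent', 'Downloads', 'PDF', 'Viewbook']);">
      Download our viewbook (PDF)
    </a>

If tagging every file link by hand sounds tedious, the jQuery approach below can automate it.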

Multimedia Interaction

Using a video player on your site? If you have a player, especially a non-YouTube one, you might be lacking information about how people use it. If your player supports callbacks on common actions like play, pause, and stop, you can tie into those and pass them a function to note certain activities.
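Roughly like this (a sketch – the function name and video title are placeholders, and you’d wire it to whatever callback hook your player exposes):

    // One callback to handle any player status change.
    // 'status' is the string the player reports: 'play', 'paused', 'stop', etc.
    function trackPlayer(status) {
      _gaq.push(['_trackEvent', 'Videos', status, 'Virtual Campus Tour']);
    }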

This assumes your player can send a string of its status along with the callback, something like ‘play,’ ‘paused,’ or ‘stop.’ The YouTube player can do this, as can most others. Then you can use the single callback function to track any player interaction. The same basic thing could be done for an audio player, or for photo galleries or slideshows.

jQuery

So far, most of the examples have focused on links, but jQuery opens up the whole DOM to interactions. Here’s an obvious use: automatically apply a download action to any links that point at PDFs.
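A sketch of the idea (this assumes jQuery and the async GA snippet are already loaded on the page):

    // Attach a download event to every link that points at a PDF
    $(document).ready(function() {
      $("a[href$='.pdf']").click(function() {
        _gaq.push(['_trackEvent', 'Downloads', 'PDF', $(this).attr('href')]);
      });
    });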

Cool, right? Expand the idea: automatically track all external links and internal page anchors, or monitor modal popup windows. jQuery opens the world to you on your pages, and lets you wire into basically anything the user is going to interact with.

In the Real World

So, how can you put this to use beyond the obvious? That was part of what I talked about in Arkansas. For example, every part of our base templates is now set up with event tracking. This allows me to watch the usage of our header and footer links across the site and see where they’re most frequently used. In those cases, I use “Header Links” and “Footer Links” as categories, “Click” as the action, and the link text as the label.
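The wiring can be as simple as something like this (the #header selector is an assumption about our template markup, just for illustration):

    // Tag every header link with an event; the link text becomes the label
    $(document).ready(function() {
      $('#header a').click(function() {
        _gaq.push(['_trackEvent', 'Header Links', 'Click', $(this).text()]);
      });
    });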

Sample report showing events for header link clicks

In this case, you can see I added the Page dimension to the report so I not only see the event labels, but where they were triggered from. Using this, over time I can measure how people use things we’ve placed in our template, and make decisions on what to change and how to change it. For example, perhaps a particular link simply falls out of usage, or we place something with the idea that people wanted it, but it turns out no one uses it. Knowing that allows us to make educated decisions and gives us a baseline to compare against.

Similarly, I’ve done this with other static tools around the site. We had a box on our homepage that was being used for admission application links. A debate was taking place on whether that was really useful, though. Looking at event clicks on them (note: we can’t track people once they enter the application itself, so I used this as a means to see who was trying to apply), we noticed a major gap between the number of clicks and the number of submitted apps. We saw, at best, an 80% dropoff in the funnel (I can’t account for apps mailed in, or ones not initiated from the home page – so the actual funnel loss is likely into the 90% range).

So, we set about finding a better way to use the space. Rather than linking directly to the app, we decided to link to informational pages first, with the goal of helping people get comfortable with us, since a large part of the loss was likely because people clicked the links not expecting to immediately enter the application (gotta build those relationships first). The results, based on watching event numbers, spoke for themselves. Pages per visit were up by about two for those users, time on site (compared to average) was way up, and general clickthroughs increased by over 50%. Without event tracking, I would never have been able to identify the problem or measure the results of the change.

In Closing

Event tracking in Google Analytics can provide a critical extra layer of depth and perspective on your existing analytics. While most normal data addresses the question of where, events allow you to watch the what. This becomes more important as you try to help your site evolve intelligently over time, rather than dumping the whole thing every few years and starting over. Events also give you an added layer of flexibility in controlling what is tracked, and how, to meet custom needs you might have for your site. Naturally, Google has a guide for implementing the event tracker that you might want to bookmark for future reference.

Below, you can find my slide deck from Context in Chaos: Informed Design Through Analytics, and you can listen to audio of the presentation over at the HighEdWeb AR site.

Do you have your own creative use of event tracking in Google Analytics on your higher ed site? We want to know about it. Share your example in the comments down below!

Lessons Learned: Joplin Crisis Communication

This has been a bit long in coming, I know. One month ago, on May 22nd, an EF5 tornado tore one of the most destructive paths in modern history through the middle of Joplin, MO. As of today, 155 fatalities have been attributed to it. This is not new to anyone at this point. Those of you familiar with me on Twitter or Facebook no doubt took note of the small role that I played in helping with communication during that chaotic time. I learned a lot from that. A whole lot. It’s taken some time to go through it all in my head and sort things out, and this past month my family faced its own tragedy in the passing of my grandfather. Needless to say, this was something that needed to get written, but I needed to get my head in the right place first.

First off, with great luck, our friends at MSSU managed to come through pretty much unharmed. They mobilized relatively quickly to use their available space for shelter and triage. Kudos to all the fine, hard-working folks there. I know many were worried early on, and it didn’t come without some cost. We must always remember that universities are large families of faculty, staff, students, and alumni. In a large-scale event, we will be impacted somehow; it is simply unavoidable. Whether it’s a tornado, hurricane, or earthquake, nearly all of us should at least consider the possibility of a major event and what we will do, both as organizations and as individuals.

On the night of May 22nd, I was sitting quietly in my recliner, no doubt doing something unseemly on Facebook. My wife and I had the TV on as the news broke in, catching the start of the tornado right as it hit (at the time, they didn’t realize what it was they were catching besides some windswept debris). Within minutes, I started hearing some chatter on Twitter, and before long, the news started getting some word about the possible touchdown of a “large” tornado in Joplin. Prior to my arrival in Pittsburg (which is about 25 minutes away from Joplin as the crow flies), I was a volunteer at our county fire department back in my hometown. Such a role breeds instincts that are not easily broken. My immediate response to news of the tornado was to want to get there as quickly as possible to help those in need. Unfortunately, recent surgery on my knee precluded any possibility of such action. But that didn’t matter. I had to do something. And I had other tools.

That’s where social media came into play. I plugged in, set up filters, turned my network on its head, and burnt some saved-up social capital. As @ErikSTL noted, I sacrificed a bit of sleep in the process. My apologies to those who put up with my constant barrage of information during that time. So that’s the setup. Days of non-stop Twitter, Facebook, and Google Doc usage have left me with some things to share, and thoughts on how they’re relevant to higher ed.

#1: It’s Called “Crisis” Communication For a Reason

Look, if a disaster hits your campus, a real disaster, be prepared to throw your best-laid plans to the wind. And more importantly, understand that there will be a lot of people working damn hard to do the best they can in the middle of a lot of pressure and chaos. I saw a lot of complaints early on about the lack of this or confusion about that. That’s going to happen no matter what. We plan, we train, we drill, but really nothing is going to catch all the things that can and will go wrong in an emergency. Expect failures. Cell phone towers gone or overwhelmed. Phone lines down. Servers destroyed. How many of you have communications plans that rely on the assumption that communication channels will work as expected? Take those away and what will you be left with? A crisis.

#2: It’s Okay to Sacrifice Some Accuracy For Speed

There is a balance here. I’m not saying to just put out whatever random garbage you hear on Twitter. But in an emergency, people on the outside crave anything that can help them stay informed, and you rarely hear people criticized for pushing out too much information in good faith. Odds are, some of what you say will be wrong anyway despite your best efforts, and you can easily note whether something is unconfirmed. For example, I heard an early estimate that put the death toll at 160, before it was officially even at 100. Later, people were saying 1500 were missing. Actual number: 232. Number of people criticized for pushing the 1500 number? 0. If you send out something like that, follow up after the fact with official data to correct the earlier inaccuracy. Tell people you’re researching unconfirmed reports. That protects against rumors and also comforts people that you are actively seeking information for them. You can almost always issue a correction, and that’s okay.

#3: Who Do You Trust?

In your plan (for what it’s worth), identify someone who can be a voice for the organization. It doesn’t have to be some big, official person. It needs to be someone who will care, and who will be safely isolated (physically) from the events so that they can do the job and hopefully avoid the disruptions that will occur within communication infrastructure. This person will help find answers for people, and direct folks to the proper resources. They serve as the traffic cop. People want a single, trusted source for information if at all possible. More than likely, whoever runs your social media now would be ideal for this role, since they already know how to tune into the channels and talk to people.

And if that scares you: while you’re debating who should be doing it during a crisis, the audience will have already gone off and started listening to a dozen different, less accurate, unofficial, but more responsive people. It’s not about having the perfect person; it’s about having a valuable person.

#4: Get Ready For Pain

In a crisis – a major one – be prepared for hard stuff. What follows is the hardest tweet I have ever had to reply to, from a gentleman looking for his mother.

[Screenshot of a tweet from a man asking about his mother near 26th and Jackson]

It took me about ten minutes to put together a response to his message. Why? 26th and Jackson was one of the hardest-hit areas. There was, quite literally, nothing left there but rubble. In a crisis though, people will reach out wherever they can for hope, and that person might be you. Be there for them, and offer it, even if you know it’s going to end badly. And this stretches beyond the immediate event. After the tornado, dozens were searching for the missing. In many cases, people were reunited through the help of Twitter, Facebook, and those on the ground. Other times, they weren’t. I would be lying if I said it didn’t get to me after a while. But if you are prepared for it, you can do a lot of good and help folks before it overwhelms you.

Mr. Bell’s mother passed away that night due to injuries sustained during the tornado. I still think about it.

#5: Know Your Tools

Twitter, Facebook – these are obvious. But understand there are other resources available to help communicate in a crisis, especially for staging information for when communication is restored to a disrupted area. For instance, by the morning after the tornado, I was running the largest single list of donation locations using a crowdsourced Google Doc spreadsheet. In under a week, the list grew from 25 starting locations to over 200 all over the country. We worked similarly to create a list of missing persons with images, last locations, who to contact, etc., until the state was able to coordinate its effort. Still others put together a fantastic Google Map with a number of information points.

But there’s more than that. Have you heard of Sweeper? It’s an open-source tool you can use to bring in multiple sources of data and structure them in a way that lets you normalize the information. By itself, it’s neat, but combine it with something like CrowdMap (a hosted version of the Ushahidi project), and you can begin to create a powerful, real-time, geolocation-based tool. There are also services like the FEMA National Donations Management Network, which helps make sure the right resources go to the right places, the Red Cross’s Safe and Well project, and Google Person Finder. Get familiar with all of these, as well as local and state tools that might be made available during a crisis.

#6: Listen, Provide, Anticipate

Set up your Twitter searches, monitor Facebook pages, turn on Google Alerts. Get your ears open. This is for two reasons: first, you need to know what people are looking and asking for; second, you need to make sure you aren’t missing important information or failing to provide it somewhere. Information can and will flow both ways, so don’t assume you know everything about what’s going on around you. Odds are, you don’t. Once you have filtered the noise from the signal, you can respond and push out important information, as well as plan ahead. For instance, the Red Cross knows from experience that in times of disaster, people want to donate, but tend to just use it as an excuse to dump old clothing. The result? Tons and tons of useless clothes that still have to get sorted, stored, and disposed of. That all produces a lot of extra work and overhead that gets in the way of real needs. Knowing this, before donations start flowing in, they can try to discourage clothing donations and save that burden. By being cognizant of such things, you can keep the system moving forward smoothly, not having to pause, backtrack, and rebuild momentum each time something doesn’t go as planned.

#7: Traffic Models Change

Everything you know about where people come to your site from is useless. The entire model you have is busted when an emergency occurs. Jonathan Steffens over at MO.gov put some information together in the following slides (21-24) about how their traffic was impacted by the disaster in Joplin.

Note that even within the same week, the social media profile flipped on its head as the crisis progressed. Early on, it was all about Twitter (instant info); later, people shifted to Facebook (static, sticky info). This is important, as it should impact how and where you share information. It should also go without saying that you’d better have tested your hardware ahead of time to make sure you can handle an unexpected surge in traffic during an emergency. This is one of the few times when really paying attention to your analytics in real time can be very valuable.

Plan Anyway

Even though you can be pretty confident that any disaster plan you make will start breaking down the second it is invoked, it is still important to have one. Even stripped down, it will provide an important foundation until more permanent or official structures can take over in the wake of events. And that plan should extend beyond just you. For example, MSSU was on the opposite side of town from the tornado, but they were quite obviously impacted, since they had the space and resources to play an important role in the relief efforts. Our schools play major roles in our communities, so think beyond your walls.

Conclusions

There you have it. Personally, I think one of the most important things you can do, if you can’t be on the ground, is help communicate. Like manual labor, it’s important to help, but not get in the way. Don’t share information that is purely speculative. Try to get data from official sources as much as possible. Never underestimate the power of the voice in situations like this. It can be calming in a sea of turbulence when someone can turn to you for help or advice. I also think that was one of the biggest failings throughout the event: communication was lacking where it probably shouldn’t have been, crisis or not. For instance, FEMA should have someone on call to do what many of us did: just serve as a point person until more official wheels can find purchase. With a single phone call they could get someone online and talking within minutes. Why hasn’t that happened?

Technology is a huge asset, but a costly taskmaster at the same time. It can do great things, but it also can’t get out of its own way. At times, it was simply impossible to really follow the Twitter stream for Joplin. It moved too fast, and at that point, it became useless. But it doesn’t have to be that way. The tools are within our grasp, and I think before long we will be doing amazing things in times of emergency. If you are looking for ways to help in the field of crisis communications, I STRONGLY encourage you to get involved with something like the Ushahidi project. It’s good, but I see a lot of untapped potential there, and they’re running mostly on grant funding and could use some good community resources to develop with.

Photo Credit: CC by-nd 2.0 by twi$tbarbie (derivative by permission)

Flipbooks: Weighed, Measured, and Found Wanting

I’m a little bit concerned with the growing frequency of folks asking this question, which I paraphrase: “X has discovered flipbooks. They love the sound they make and want to put everything into it. How do I stop them?” Beyond the obvious answer of telling them flipbooks support terrorism, there are many, many reasons to avoid them. There are a number of different tools tossed around in the discussion: Issuu, Scribd, Flipping Book, enhanced HTML, etc. For the most part, they suffer many of the same faults – though not all will suffer every point I make below. You should be able to pick out several that apply in any given situation, though. And if you’re in the group that didn’t know there were reasons to avoid these, well, here you go.
