If you’re in higher ed web development, you probably saw this article making the rounds criticizing university websites. Melonie Fullick put it together, with feedback from other Twitter users, after trying to research information across various sites. I, too, was recently frustrated while researching programs at institutions, finding it infuriating at times to get relatively simple information. I’ve talked with a couple of folks about the article as well, and thought I’d give some additional commentary. Not necessarily counterpoint or refutation, just an additional viewpoint from someone who spent years behind that curtain.
This time of year brings with it a particular discussion that I always see repeated in the various higher ed circles I still follow: commencement livestreams. Not whether universities should be doing them – dear no, that’s long since settled – but rather what should go with them. Should they be encouraging people to take selfies, what’s the hashtag, are they curating content from Instagram to a projector, are you using this tool, that tool, and blah, blah, blah. Some recent reading I was doing also felt pertinent to this topic, and so I wanted to challenge my higher ed friends out there in webdev, marketing, advancement, et cetera with a question: why aren’t you trying to focus on the business value of commencement streaming?
I know you don’t want to admit it, but we all know that you love Kitchen Nightmares as much as I do. I actually enjoy cooking shows of all sorts and like applying their lessons to my somewhat limited selection of Kansas delicacies, which are generally limited to cuts of select beef in different shapes. Seriously, ask about my steak sushi sometime. Nevermind. My point is that while watching Kitchen Nightmares today, I had a thought about the web development trade and just how much it parallels the cooking world. I want to share these thoughts.
Allow me to borrow some definitions from Wikipedia, the clear authority on all things culinary.
This person is in charge of all things related to the kitchen, which usually includes menu creation, management of kitchen staff, ordering and purchasing of inventory, and plating design. Chef de cuisine is the traditional French term from which the English word chef is derived. Head chef is often used to designate someone with the same duties as an executive chef, but there is usually someone in charge of them, possibly making the larger executive decisions such as direction of menu, final authority in staff management decisions, etc. This is often the case for chefs with several restaurants.
In many restaurants, like many web shops, this is where the story can begin and end – for better or worse. Sometimes you’re just stuck flipping the burgers on the grill by yourself. Good head and executive chefs are not only great cooks; they generally have a keen understanding of the business world as well, because at the end of the day, if the financials aren’t right, the restaurant won’t make money and people get fired. Likewise, a good web lead will understand the business they work in, and how it ties into things like marketing and customer service. Being a really good cook doesn’t make a person a chef, and being good at writing HTML doesn’t necessarily make a person the right one to lead a web office. In the end, we can also distinguish between head and executive chefs in the web world – it’s similar to comparing a CWO to a Director of Web Services. Most organizations lack that level of granularity, though, and as with restaurants, the duties and responsibilities will be run from a single touch point that could be described as either or both simultaneously.
The Sous-Chef de Cuisine (under-chef of the kitchen) is the second in command and direct assistant of the Chef. This person may be responsible for scheduling and substituting when the Chef is off-duty and will also fill in for or assist the Chef de Partie (line cook) when needed. This person is responsible for inventory, cleanliness of the kitchen, organization and constant training of all employees. The “Sous-Chef” is responsible for taking commands from the Chef and following through with them. The “Sous-Chef” is responsible for line checks and rotation of all product. Smaller operations may not have a sous-chef, while larger operations may have several.
When thinking sous chef, think art or creative directors. You can also put product and project managers in this category. These people should be fully capable of doing any of the tasks they oversee in a pinch, but more likely they’re somewhat managerial in nature in the office. They communicate and coordinate. They’re the ones who will go to the project meetings so the web versions of the chefs de partie don’t have to.
Chef de partie
A chef de partie, also known as a “station chef” or “line cook”, is in charge of a particular area of production. In large kitchens, each station chef might have several cooks and/or assistants. In most kitchens, however, the station chef is the only worker in that department. Line cooks are often divided into a hierarchy of their own, starting with “first cook”, then “second cook”, and so on as needed.
Our chefs de partie are the role-specific people: our front end developers, our interaction designers, our graphic designers. Each has their station “on the line,” and best serves their team when they are working properly in concert with the others. Over time, they’ll usually pick up skills from other stations to augment their own, allowing them to move in and out of spots as necessary to help out. This gives them flexibility, a broader skillset, and an understanding of team building. It’s okay if they can’t do every station well, as long as they understand how each station gets them from prep to table.
A commis is a basic chef in larger kitchens who works under a chef de partie to learn the station’s responsibilities and operation. This may be a chef who has recently completed formal culinary training or is still undergoing training.
Interns. Nuff said.
What I think is the most important part of the striation of responsibilities in the kitchen is the principle behind earning your stripes. In a serious kitchen, you don’t just come in and become sous chef without the requisite experience. No, you start as the commis, and you work your way up. By the time you’re a sous chef, or ready to become the head chef in your kitchen or somewhere else, you know and understand the roles under you because you’ve been there. It helps you respect and understand those that work for you. Folks who have worked with me in the 24 Hour Plays have heard me give a very similar speech as it relates to theatre – that I find it very important that if you want to direct, you should spend some time as an actor, and a technician, etc. Be the ball.
Ask any ten chefs what skills are most important to being great at the craft, and you’re likely to get a broad spread of answers. But like with any trade, there are a few very atomic skills that are fairly consistently important. For instance…
A chef with a bad palate is like a web developer who uses FrontPage. Having a well-trained palate is imperative to selecting and procuring quality ingredients for meals. We aren’t buying beautiful rockfish or delicious cuts of beef, but we are selecting CSS frameworks and jQuery plugins. We have to have a good sense of color and design. We need to know whether WordPress or Drupal is the best platform for a new site. Our ability to “taste” the environment we’re developing in and pick the right components to combine into it can make or break a project. You know you’ve developed a good palate when you can look at a website and tell what CMS it uses. Likewise if you can make the CMS you use disappear entirely to the user: they know it’s there, but you know how to keep it all perfectly balanced. This also gives you the instinct to tell when something is going wrong – when ingredients have “soured,” or you’re using too much of something, or not enough of something else.
Mise en place/Hygiene
Mise en place is a French phrase which means “everything in place.” It’s used to refer to having all your ingredients and tools ready and where they belong. I’m grouping this with hygiene because good mise en place skills inherently reinforce kitchen cleanliness and maintenance as well (I’m not just talking about washing your hands after peeing). Web mise en place would be everything from making sure your workplace ergonomics are well planned, to making sure that you lay out your software and tools in a way that makes sense and encourages good development. Just because our “countertop” is digital, doesn’t mean you shouldn’t think about how you have windows open and arranged, for instance. Web hygiene means coding clean – maintaining comments, keeping HTML semantic, and not leaving old code out to rot on the countertop, so-to-speak. A sloppily coded and presented website can be a turn off to visitors. Good web hygiene is a trait of good attention to core user experience on your site.
This is a pretty direct metaphor. The chef’s knife is an extension of their hand, arm, and body – as is our Wacom stylus, magic touchpad, or mouse. It’s all about core skill competency and practice. A good chef can dice an onion blindfolded in a tenth of the time you or I would take. You should be able to spin up a Git repo, fork, and commit just as effortlessly. Mysqldump and LESS are your bitch. You can recite CSS selectors and attributes like a multiplication table. The knife needs to be sharp, and the knife is an extension of you. Those skills make your job easier, and they show. Being adept won’t always change the “taste” of your end product, but the craftsmanship will come through, and visitors will respect it and peers will admire it.
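Knife work like that can be drilled. As a minimal sketch (the scratch directory, file name, and commit message are my own invention, not from the article), standing up a fresh repo and landing a first commit from muscle memory looks something like:

```shell
# Knife drill: scratch repo, one clean commit, no thinking required.
# (Names and paths here are illustrative only.)
set -e
repo=$(mktemp -d)                      # disposable "cutting board"
cd "$repo"
git init -q                            # sharpen the knife
git config user.name "Line Cook"       # local identity so commit works anywhere
git config user.email "cook@example.com"
echo '<h1>Plated</h1>' > index.html    # the "dish"
git add index.html
git commit -q -m "Initial commit: clean, semantic markup"
git log --oneline                      # plate check
```

If that sequence takes conscious thought, it’s worth repeating until it doesn’t – the same way a cook practices dicing until the knife disappears into the hand.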
When all a chef’s skills meet in the center, it makes for a magical sixth sense. Put a good chef in a room with some ingredients and give them a couple hours, and more than likely they will magically appear at the end with a beautiful dish. That’s the entire practical value in so many of the cooking competition shows – can the cook work under pressure with unknown variables? Knowing how flavors mix, understanding cooking principles, and solid plating techniques are all integral to producing fantastic dishes in a pinch, even when they don’t know what they’re walking into. Because no web developer has ever had to build something on the fly at the last minute, right? This is also how we grow as designers and developers. Anyone can follow a recipe – they’re just a Google away. But it’s what you can do with that recipe to make the end product your own that will really set you apart. If web development were just about following recipes, we would have been out of a job long ago, because every solution would be a simple cut-and-paste job a monkey could do. There are times when you can even get away with that, and certainly we all have. But those are times when you’ll never stand apart, and never produce anything unique.
Oh, and then there’s…
…that important fact about the sheer way work gets done. If you get nothing else from my rambling above, take this away with you. You know one of the common, recurring themes on Kitchen Nightmares regarding why restaurants are failing? Menus and procedures forced upon the chef without regard or respect for the chef’s role and abilities. People playing in kitchens who don’t respect ingredients. The head chef is rarely the top person in the restaurant – someone else probably owns it – but successful restaurants know how to put the chef in charge of the menu and let them run the kitchen their way: coordinating the production and distribution of food, taking orders from the wait staff, and cycling out the finished product. A president, dean, or director of marketing should be able to trust their head web person implicitly – and it’s that inability to trust that I see repeated over and over as a core complaint from our peers at other institutions.
It’s that faith and commitment to excellence that makes the difference between a truly successful kitchen and restaurant, a perfectly mediocre one, and one that ends up with its doors closed.
Blame Travis over at EMG for this article – he pulled me into a question on Facebook about what higher ed should be doing about Pinterest. It was something that shouldn’t have riled me up, and yet it did. It doesn’t help that many of you may have also caught the article over on the CASE blog about Pinterest, and are now wondering: “Is that for us?” It was written to showcase the work Oberlin is putting into their Pinterest account – and I applaud their effort, if not their methods. Warning: this is a rant. You can jump to the end if you’d like to read the Storify I put together summarizing the discussion several of us had this morning on this topic.
Back around 2008 or so, I made fun of Twitter. Why the hell would people want to “microblog,” I asked. But the idea of Twitter was pretty new, and I’m nothing if not even-handed (usually), so I set up an account to at least try it out before writing it off. Nearly 28,000 tweets later, I think the results of that are clear. I’m doing the same thing to Pinterest now (criticizing its purpose and usefulness), and yes, I recognize the hypocrisy in that. Pinterest is novel, sure. But I definitely think that higher ed should not be hemming and hawing about whether or not to use it (to be specific, I’m talking about institution-wide usage, as a marketing and community engagement tool – as a classroom tool, that’s another discussion). Pinterest has been compared by some to a more visual Tumblr. For current examples, I think that’s a relatively fair comparison. In that light, consider this:
Remember Plurk? Do you recall how it was going to revolutionize the Twitter audience and experience, and instead mostly faded into obscurity as the MySpace of microblogging? In its defense, I do feel like Plurk was ultimately a better tool than Twitter – sometimes it’s just hard to fight the power of first-to-market. Imagine, then, if we’d all run out and set up our walled gardens in Plurk – how much effort could have been wasted. And imagine if you’d invested time and effort into Plurk without doing the same for Twitter. That’s not to say that we shouldn’t try new things. As Dylan Wilbanks pointed out:
@fienen I agree it's not worth the time to just jump on this week's trend, but someone has to pioneer.
— Dylan Wilbanks, Human Grumpy Cat (@dylanw) January 12, 2012
But I also feel that there is a fundamental difference between the mentality of “We should do this because we think we can do something new and awesome” and “I wonder if we should do this new thing because it seems cool and trendy and might be popular later.” Much of higher ed falls into the latter group, in my experience. Being a pioneer is hard. I like hard, but it’s a rare place that can always innovate, always try new things, and be successful enough at it to keep it up (*cough*Google*cough*). The other thing to consider is that we’re just now really settling into our social media properties. For most institutions this is Facebook, Twitter, and YouTube, with a few branching out into LinkedIn, Google+, and Tumblr as well. Given the Plurk/Twitter example above, how wise do you think it would be to invest effort into Pinterest without first having gotten the hang of Tumblr? I know that’s a fragile argument, as first-to-market is no guarantee (just ask Friendster and MySpace), but it’s certainly a good place to start, and in some cases worth riding out until the community at large tires of it.
And that brings me to the core issue.
Do less better.
If you can’t sit there and tell me that you’re at all the “common” spots, and that you’re doing great at them, then why would you consider branching out even farther? I won’t fault you for name-reserving an account just in case, but if you are investing time and effort into setting up a presence there, are you prepared to break down what that investment is worth? In the case of Pinterest, the largest part of the audience is currently women between the ages of 25 and 34. Is that your target audience? Is that time more valuable than other methods of sharing user generated content through existing networks and tools? If you are established on Facebook and Twitter, we now have the resources to really dig into the value and opportunity in those channels. That gives you the luxury of being able to wait a bit on Pinterest and see just where it is six to twelve months from now. That’s not being overly cautious or lazy, that’s being strategic. That’s showing that you know you have limited resources for community engagement, and that you’d much rather put them to use where they’re worth the most. As Krisna put it:
@fienen I'd rather have stronger interactions with what we're involved in now than to have everything & do a mediocre job across the board.
— Krisna Poznik (@krisnap) January 12, 2012
Think about your core questions: who, what, when, where, why, how. In the long run, some brands will likely find successful ways of using it. But you must remember that the more specific your audience and community is, the more specific your strategy for them should be. For us, consider:
- WHO do you plan to reach on the site? WHO will be your voice on the site?
- WHAT do you expect to share, produce, or facilitate? WHAT does the audience expect to get?
- WHEN are you going to plug it into your workflow/editorial cycles?
- WHERE are you going to get or find content? WHERE will this fit into other existing strategies?
- WHY are you investing the time here, instead of at X, Y, or Z?
- HOW will you promote and create value in your new property? HOW will you add value to the channel?
One of the best uses I could see for Pinterest would be at a school that has highly visual arts or similar programs. Pinterest is a visual medium, and certain programs on an individual basis could find promotional success there. If you want to jump in, that’s where to start. Be strategic about your use, and pioneer creative marketing techniques tailored to the items you’re selling. Institution wide? No. Have a plan, have KPIs (key performance indicators), define how it fits your marketing and customer service strategy. Yes, you can curate and share other user generated content about the school too, but Pinterest is hardly unique in enabling that kind of functionality. How do you plan on adding value to the channel? Bottom line, if you feel like you must do it, be smart about it. It’s hard to learn from a failure when you didn’t have a plan to begin with. If you at least go in with some kind of strategy, come success or failure you can learn from the situation and do better next time.
It’s almost like higher ed is developing ADHD. So many were slow or late to the social media game, that there is now a panic that we’ll miss a boat (hint: you will miss boats. It’s going to happen. It’s not the end of the world). It’s like our immune system responded to Web 2.0 by overreacting. It’s okay to pace yourself and move slowly, as long as you’re also smart about it, and not putting it off just because you want to wait for waiting’s sake. In the end, if you’re asking “Should I be on service X,” then the answer is likely no, because it means that you already don’t have any idea what you’d do there (in the case of Pinterest, you’re going to mainly be sharing and promoting community generated content. Why do you need Pinterest to do that?). Instead, focus your efforts on being successful with The Big Three (FB, Twitter, YT), and wait until you’re comfortable enough to be smart and agile within the bigger sandbox.
I know this sounds like the “Twitter? Why Would I Want That?” conversation all over again. But I personally believe it’s much closer to the lessons learned from services like Plurk. We need to get used to the fact that new services will quickly become a dime a dozen, and it will be much more important to be smart about our resource investments rather than putting a hand into every single basket that comes up. Several of these points and a lot more are discussed in the Storify below. Feel free to share your thoughts in the comments section.
More private industry market factors coming to bear on higher ed. I’m guessing this is just the start.
In a relatively short period of time, April 1st has morphed from the old, traditional April Fools’ Day into Ignore the Internet Day. Whether you’re Google or ThinkGeek, the first has taken a special place in the heart of internet geeks everywhere, who have seized the opportunity as a chance to take a break and laugh at each other for a day. The only problem is that it has made it impossible at times to pull meaning from the noise during the day, and the quality and breadth of the jokes seem to have passed their pinnacle.
For the past several years, higher education has been able to release a little pressure through April Fools’, and this year was no different. At the end of this article you’ll find a gallery of the different jokes schools around the country played, and the examples range from faux press releases, to Facebook gags, to homepage takeovers. And all this fun is not without some lessons. Lori Packer had a nice start to things. She put out an article reflecting on what they did at the University of Rochester, reverting to a 15-year-old layout. In the post, she talks about the things she had to edit on the page to make it work today, and laughs at some of the old techniques that were used – how will we compare today to what we’re doing in 2025? Do you think about the future of what you’re producing? Not that what we did in 1996 was bad; it was just the “old way,” and that makes for interesting lessons on growth today. I actually find a lot of enjoyment going back and looking at code (that is frequently still in use) that I wrote years ago. It’s a great opportunity to see how we’ve grown as professionals, and sometimes catch mistakes so that we don’t continue them in the future.
Karine Joly was quick on the draw too, showing off some favorite gags. She made one point that I think is absolutely worth reflecting on. Look at some of the creativity that comes out of April Fools’ Day (even beyond higher education). Imagine: you just go to a designer and say “Go crazy, we need something for the day that will make people laugh,” and they come back with some of the stuff we see here. Why do we stifle that in “normal” processes? In trying to be funny, we really are in danger of starting to do things right, in a way. I think we need to do a lot better at encouraging that. There’s a lot to be gained from the sort of web development that happens when you shoot from the hip. When every single move you make is measured and calculated, it can start to feel stiff and lifeless. Imperfection can bring with it a certain kind of charm all its own that you cannot manufacture.
On the other end of the spectrum, we have the schools that do nothing. Hint: nearly all of them. Now, it’s easy to say that we simply can’t take time away from pressing needs for jokes, and I do respect that (we didn’t do anything for exactly that reason). But there are schools that don’t do it out of fear. They are worried that one day of in-context joking will somehow damage their reputation. Additionally, as was brought up on the UWebD group, there could be international considerations among prospective students who might not get the joke. Valid considerations, all. But I counter with this: Google does it. They are neither worried about people not getting the joke, nor do they question whether people understand what April Fools’ Day is. The day online is so heavily saturated with April Fools’ news and talk that my argument is it’s pretty hard not to know there’s something special about the day in a few parts of the world. An assumption on my part, but I think a safe one. And to add to that, if you are genuinely sacrificing important functionality for the joke, you’re probably doing it wrong. There’s a whole lot to be said for subtlety. April Fools’ does not need to get in the way.
Ultimately, the decision is a measurement of risk-reward. Is there a chance you might put people off that don’t get it? Sure. But do you draw additional attention to yourself by doing it? You bet. Is it worth it? That’s a question each school has to answer on their own. I would just suggest making the decision reasonably, and based on numbers as much as possible. If the only reason you don’t do it is because administrators are scared, then you’ve just found the best reason to do it, in my opinion. If you never push boundaries, you’ll never excel.
I’m not sure if this is just my perception or not, but while I think some places are getting better at April Fools’ jokes, it did seem like there weren’t as many schools joining in this year as in years past. What do you think? Maybe I just wasn’t as observant as normal, but I wonder if the changes to the environment over the past year are starting to impact the amount of risk-taking we are capable of. I think there’s a lot to be said for being able to not take ourselves too seriously, at least for a day. My bigger concern is just that the jokes get stale very fast. It goes back to how popular the day has become online – there are a lot of people doing the same gags, the same jokes. But I think there’s a lot of value to be had in trying to bring the funny.