Sure, having the arcade game you helped pioneer in the U.S. parodied on Malcolm in the Middle is cool (Malcolm's dad battles a young fanatic for "Jump, Jump Dance Party" supremacy, a thinly veiled Dance Dance Revolution). But RedOctane co-founder, President and CEO Kai Huang knew he'd really arrived when South Park riffed on Guitar Hero, the pop-cultural phenomenon he co-created with his brother, Charles.
In the South Park episode "Guitar Queer-o," Mr. Kincaid (evidently, his Partridge Family royalties have run out) signs a smokin' Stan to a contract ("You could earn over a million points!"), but, in an effort to ease the pressures of impending rock stardom, Stan spirals into a "Heroin Hero" addiction, a sly nod to the addictive nature of Guitar Hero itself.
The latest installment of the game, Guitar Hero World Tour, features expanded peripherals, including drums and a microphone, as well as a potential game-changer: the GH Music Studio, which enables players to compose original music (using the GH peripherals) and upload it to GHTunes, a user-generated content hub, for fellow enthusiasts to download and play. According to Kai, he and Charles helped usher in the "rhythm game" revolution because they wanted everyone to be able to rock in front of adoring throngs, irrespective of innate ability. Both Kai and Alan Flores, Neversoft's lead Guitar Hero producer, are hopeful that the GH Music Studio will unleash every player's latent songwriting ability.
Neversoft, as you may recall, was betrothed to RedOctane following the franchise's split from Harmonix and RedOctane's acquisition by Activision in 2006. Harmonix, in turn, ran off with MTV Games and spawned Rock Band in 2007. And, while Rock Band was first out of the gate with an expanded peripheral set (guitar, drums, and vocals), GH intends to catapult over its rival with the GH Music Studio feature. "You compose through the guitar peripheral," explains Flores. That peripheral now boasts a bigger body, detachable neck, longer whammy bar, and a slider.
"We tested every conceivable combination," says Kai of the GH guitar. "Three buttons, four, six, and seven." Five seemed to offer the perfect balance of complexity and simplicity for newbies and enthusiasts alike. With five buttons, so goes the theory, your hand has to slide up and down the fretboard, like a real guitarist's. They also tested other interfaces, but the rock 'n' roll highway UI won out in the end.
"For World Tour, we worked on our peripheral kit, including drums and a microphone, for over two years," says Flores, who relied on pros like Travis Barker (Blink-182) and Stewart Copeland (The Police) to enhance the level of verisimilitude. "Our drum kit has a nice bounce," Flores contends, "and the pads are velocity-sensitive, so if you hit harder, you get extra points." In addition to the drum kit, the World Tour bundle boasts a professional-looking USB microphone.
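Velocity-sensitive scoring of the kind Flores describes can be sketched as a base score plus a bonus that scales with strike strength. The function below is purely illustrative (the names, point values, and scaling rule are invented, not the game's actual scoring logic):

```python
def score_drum_hit(velocity, base_points=50, bonus_points=25):
    """Score a single drum hit.

    velocity: normalized strike strength in [0.0, 1.0], as a
    velocity-sensitive pad might report it. Harder hits earn a
    larger bonus. (Hypothetical values; the real game differs.)
    """
    if not 0.0 <= velocity <= 1.0:
        raise ValueError("velocity must be in [0.0, 1.0]")
    return base_points + round(bonus_points * velocity)
```

A light tap scores the base value, while a full-strength hit earns the maximum bonus on top of it.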
Project Lead Brian Bright says Neversoft began developing a proprietary game engine back in 2002, in anticipation of the Xbox 360 and PlayStation 3 platforms, and used it for Guitar Hero III: Legends of Rock. "Our team took six months off to develop it and, at the time, it was next generation," Bright says. But technical advances come so fast and furious in game development that the engine is now merely current, and probably on its way to defunct. The core functionality of a game engine typically includes rendering for 2D or 3D graphics, a physics engine for collision detection and response, sound, scripting, animation, artificial intelligence, networking, streaming, memory management, threading, and a scene graph.
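The subsystems in that list are typically orchestrated by a fixed-timestep game loop that updates each one every frame. The skeleton below is a generic, toy illustration of that structure, not Neversoft's engine; the class and method names are invented:

```python
class Engine:
    """Toy fixed-timestep loop over classic engine subsystems.

    Subsystem methods are stubs standing in for real physics, AI,
    animation, and rendering work. Illustrative only.
    """
    DT = 1.0 / 60.0  # simulation step: 60 updates per second

    def __init__(self):
        self.time = 0.0
        self.frames = 0

    def update_physics(self, dt): pass   # collision detection and response
    def update_ai(self, dt): pass        # artificial intelligence
    def update_animation(self, dt): pass # skeletal/keyframe animation
    def render(self): pass               # walk the scene graph, draw

    def run(self, seconds):
        steps = round(seconds / self.DT)
        for _ in range(steps):
            self.update_physics(self.DT)
            self.update_ai(self.DT)
            self.update_animation(self.DT)
            self.render()
            self.time += self.DT
            self.frames += 1
```

One second of simulated play is exactly sixty trips through the loop, which is why console CPU budget per frame matters so much.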
Most game makers re-use or license existing game engines, then rely on middleware to augment them ("middleware" being software suites with more robust capabilities in a specific area than the core engine offers). "For instance, we used Havok middleware in Guitar Hero for hair and what we call 'accessory bones,' or physics-based objects that sway with the characters," says Bright. Much of Guitar Hero (and of Neversoft's other monster franchise, Tony Hawk) is implemented first in script "to put powerful tools in the hands of designers like Flores," notes Bright, "but then it's coded properly so it runs fast and is optimized for the game." Despite quantum leaps in processing power from manufacturers like Microsoft and Sony, developers are always up against the limitations of the console's CPU.
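Physics-driven "accessory bones" of the sort Bright mentions are commonly modeled as damped springs: each frame the bone is pulled back toward its rest pose while keeping some momentum, so it sways and settles naturally. A minimal one-dimensional sketch of that idea (not the Havok implementation; constants are arbitrary) looks like this:

```python
def sway_step(offset, velocity, dt, stiffness=40.0, damping=6.0):
    """Advance a 1-D damped spring one timestep (semi-implicit Euler).

    offset:   bone's displacement from its rest pose
    velocity: current swing speed
    Returns the new (offset, velocity); over time the bone settles
    back to rest, which reads on screen as a natural sway.
    """
    accel = -stiffness * offset - damping * velocity
    velocity += accel * dt
    offset += velocity * dt
    return offset, velocity

# Nudge a bone away from rest and let it settle.
x, v = 1.0, 0.0
for _ in range(600):          # ten seconds at 60 Hz
    x, v = sway_step(x, v, 1.0 / 60.0)
```

After a few seconds of simulated frames, the offset has decayed essentially to zero, exactly the behavior you want from hair or a dangling strap.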
Of course, it all starts with a notation process that is never easy, even for a simple, three-chord scorcher like "Anarchy in the U.K." by the Sex Pistols. "First, we get the master tapes," Flores explains. "Often, the artist just hands over the Pro Tools sessions and we get the stems" (the individual components of a master mix). The heartbreaker is when the master tapes are literally that: analog tapes that have decomposed and degraded, or been lost altogether. "We hear horror stories," Flores laments, "of band members who sold master tapes for a thousand bucks, or of tapes rotting in someone's basement."
"But once we've got the stems," Flores continues, "we tempo map the song, and start sequencing the stems into harmonic progressions that map to the guitar peripheral's buttons." There was an attempt at automating this notation process, and various scripts were written, but something always got lost in translation. "We dubbed the notation program Murder Face, because it just murdered the notation," Flores laughs. "There is no software that can match the human ear for polyphonic chord detection," Bright agrees. "We're waiting for some university research department to crack the code."
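Tempo mapping, the first step Flores mentions, pins beat positions to timestamps so that note events line up with the audio even when the tempo changes mid-song. A hypothetical helper (not the team's actual tooling; real charts typically encode tempo as MIDI tempo events) might look like:

```python
def beat_to_seconds(beat, tempo_map):
    """Convert a beat position to seconds via a piecewise tempo map.

    tempo_map: list of (start_beat, bpm) pairs, sorted by start_beat,
    with the first entry at beat 0. Within a segment, one beat lasts
    60 / bpm seconds. Illustrative sketch only.
    """
    seconds = 0.0
    for i, (start, bpm) in enumerate(tempo_map):
        end = tempo_map[i + 1][0] if i + 1 < len(tempo_map) else beat
        span = min(beat, end) - start
        if span <= 0:
            break
        seconds += span * 60.0 / bpm
    return seconds
```

For a song at a steady 120 BPM, beat 4 lands at the 2-second mark; a tempo change partway through simply starts a new segment with its own beats-per-minute rate.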
For the first time, vocal tracks were added to Guitar Hero, and Bright says they're easier to notate than guitar. "With our guitar, we have only five buttons to map myriad chords and notes," he explains, "and, once we have the 'expert' notation, we have to interpolate for 'hard,' 'medium,' and 'easy' settings." While notation may be easier for voice, the team was in virgin territory with vocal recognition. "We had to develop pitch recognition software, and that came with a whole host of issues," Bright reports. (Music recognition, the mathematical analysis of an audio signal and its conversion into musical notation, usually in MIDI format, is a very difficult artificial-intelligence task.)
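The MIDI end of that pipeline is at least well-defined math: once a detector has estimated a fundamental frequency, the standard formula 69 + 12·log2(f/440) maps it to a MIDI note number. A minimal version of just that conversion (an illustration of the formula, not Neversoft's detector, which has to do the far harder job of estimating the frequency from a live microphone):

```python
import math

def freq_to_midi(freq_hz):
    """Map a fundamental frequency to the nearest MIDI note number.

    A4 = 440 Hz = MIDI note 69; each semitone is a factor of 2**(1/12).
    Estimating freq_hz from a noisy vocal signal is the hard AI part
    this sketch skips entirely.
    """
    if freq_hz <= 0:
        raise ValueError("frequency must be positive")
    return round(69 + 12 * math.log2(freq_hz / 440.0))
```

Concert A maps to note 69, middle C (about 261.63 Hz) to note 60, and each octave adds or subtracts twelve.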
"And, we had to make sure the vocal recognition software operated quickly enough, didn't hinder the CPU, and could distinguish two vocalists singing at the same time." The company developed specialized algorithms to interpret the incredible range of the human voice. "When we're happy with the notation," says Flores, "we note-track the button presses, and add cues for lighting and pyrotechnics."
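The interpolation Bright describes, deriving "hard," "medium," and "easy" charts from the "expert" notation, can be crudely approximated by thinning notes. The sketch below only shows the idea; in practice the easier charts are hand-tuned rather than mechanically decimated, and these names and ratios are invented:

```python
def thin_chart(expert_notes, difficulty):
    """Derive an easier note chart by keeping every Nth expert note.

    expert_notes: list of (time_in_seconds, button) events.
    A crude illustration; real difficulty authoring is hand-tuned.
    """
    keep_every = {"expert": 1, "hard": 2, "medium": 3, "easy": 4}[difficulty]
    return expert_notes[::keep_every]
```

Run on a five-note expert phrase, "hard" keeps three of the notes and "easy" only the first and last, which is roughly the density drop players experience between settings.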