As information and content proliferate across the internet, users seek more engaging and personalized content and user interfaces.
In everybody's first social network, information is tyrannically filtered and controlled by parents, relatives, and teachers. But as kids age, some of their parents' judgments and rules, like curfew times, come to seem arbitrary, and the information filter begins to appear biased and fallible. Much of adolescence is spent seeking alternative sources of information, such as friends, school, and TV, and formulating one's own social network.
In the early 1990s, a new network established itself as a fresh source of content. In its beginnings, the internet was embraced primarily as a medium for people to disseminate their passions. Before businesses and corporations got into the game, content was mostly user-generated and anarchic.
The World Wide Web circa 1995
Speculators quickly saw the potential riches of this new medium and rushed to become the ultimate authority on every topic. Sites like Garden.com and Pets.com sprang up, and Google's algorithms became the ultimate arbiter of web searches. We called this the dot-com boom.
Much like the unilateral flow of information from parental figures, these authority sites came to seem too stubborn, bureaucratic, and slow to adapt to the deluge of user-generated content pouring out of home computers. A new phrase, Web 2.0, was coined to describe a movement that ran contrary to such top-down approaches. Not a technical development, Web 2.0 was a conscientious approach to web design that facilitated interactive information sharing, user-generated content, and collaboration, encompassing social networking sites, video sharing, blogs, and wikis. Someone looking for a good plumber, rather than trusting search engine results, could ask for recommendations within their own trusted network. Sites like Facebook and Pandora thrived by letting users set the parameters of their tastes and tendencies, then progressively refine their experience.
Some critics like Andrew Keen weren't as enamored with the movement, arguing that Web 2.0 had "created a cult of digital narcissism and amateurism, which undermines the notion of expertise by allowing anybody, anywhere to share (and place undue value upon) their own opinions about any subject and post any kind of content regardless of their particular talents, knowledgeability, credentials, biases or possible hidden agendas." He added that "the core assumption of Web 2.0, that all opinions and user-generated content are equally valuable and relevant," is misguided, and that it is instead "creating an endless digital forest of mediocrity: uninformed political commentary, unseemly home videos, embarrassingly amateurish music, unreadable poems, essays and novels," also stating that Wikipedia is full of "mistakes, half truths and misunderstandings."
While there are elements of truth to Keen's musings, the denizens of Web 2.0 didn't seem to care. They didn't mind that their content was idiosyncratic, highly opinionated, and ultra-personal. Their desires in the virtual world were simply reflections of their real-world desires. All of this was legitimized in 2006, when Time magazine named "You" its Person of the Year for building "a new kind of international understanding, not politician to politician, great man to great man, but citizen to citizen, person to person."
Time Magazine, December 2006
Naturally, Web 2.0 spread to mobile devices. Along with Facebook and YouTube, programmers designed other user-generated content applications like Trapster, which let drivers post speed-trap locations in real time for everyone to see.
Newer technologies like augmented reality let contributors visually tag their surroundings, while other apps let users answer the age-old question, "Hot or not?" from their mobile devices.
Hot or Not iPhone app