The Future of Search: A Different Perspective

By Anderson, Stephen P
      Publication: Design Management Review
      Date: Thursday, January 1, 2009

HEADNOTE
The mission at Viewzi is to dramatically improve the search experience. But, as Stephen Anderson explains, that doesn’t mean developing a better search engine; rather, it means developing a better way to view search results. The results are a series of custom-tailored scenes whose look and feel change depending on the intent of the search. It’s an impressive innovation that promises to reshape the Internet landscape.
Google recently celebrated its tenth birthday, sharing with the world a nostalgic recreation of the 2001 version of its site. Surprisingly, however, although there has certainly been a steady stream of subtle improvements over the years, the interface for today’s Google search engine results page (SERP) isn’t all that different from what we saw nearly a decade ago (Figure 1 on next page). This isn’t a bad thing: Google, Yahoo!, and hundreds of other search companies have invested billions of dollars to improve the algorithms that spider the Web, index pages, and bring us back the information we want. And newer entrants like Powerset and Swingly are perfecting the math behind natural language recognition. Good search results depend on the engines behind them. But as the quality of results gets ever better, are we reaching a point where value comes from more than just good search results?
Author/professor Don Norman once stated that "when technology delivers basic needs, user experience dominates." The mobile phone industry is a great example of this evolution. No one would now consider entering that market with the original Motorola DynaTAC. What made it successful at the time – a phone that doesn’t need wires – is now an expectation. Mobile phones have continued to become more reliable and usable and are packed with some useful (and not so useful) features. The introduction of the iPhone (and its imitators) raised the bar once again – not in terms of more features, but in the way in which people experience information. Now our mobile phones aren’t just tools – they’re also fun to use! This process of product maturity forms the basis for my User Experience Hierarchy of Needs model (Figure 2 on next page), in which I propose that most technology product and service experiences go through six levels of maturity, from "Hey, this thing actually works!" to "This is meaningful in my life."
IMAGE ILLUSTRATION1
Figure 1. The interface for Google’s search engine results page (SERP) hasn’t changed all that much over the years.
So where does the search industry fall along these lines of product maturity? If you look at many of the top trends behind search, much of the focus is still on how we procure results. Mahalo asks users to submit data. Rollyo allows individuals to create their own custom search engines, with results limited to selected websites. Similarly, Lijit is a site-specific search engine (installed on your blog) that can also search your wider social network. And then there are all the dollars invested in natural-language search. These are all worthwhile ventures; capturing even one percent of the search market can make you a billion-dollar company. But which do we need: even better results, or a way to make sense of the results we’re given?
IMAGE ILLUSTRATION2
Figure 2. With six levels ranging from functional up to meaningful, the User Experience Hierarchy of Needs model is useful for understanding exactly where a product is in its maturity.
This is the question we asked ourselves when founding Viewzi – a new kind of search company. Instead of focusing on the search engine (we leave that work to Google, Yahoo, Microsoft, and other partners who excel at this), we focus on how people experience information. In short, we focus on search intent.
Framing the problem
As designers, we talk about the importance of properly framing a problem. For the last decade, the search problem has been framed as, "How can we make search better?" This assumes that one size fits all and that we can design one correct way to experience information. A look at any magazine rack quickly tells you that content needs vary dramatically and that people like to experience information in many different ways. Harvard Business Review, People, and Wired each represents a different aesthetic, appropriate to the content and the customers they reach.
So what about search? Should the search results for "Paris Hilton," "hematoma," and "chicken recipes" all look the same? We think not. "How can we make your search for [name a topic] better?" is a fundamentally different question from "How can we make search better?" By starting with a specific search query (and by designing for a specific intent), we’re continuously uncovering new ways to make specific search experiences better. Think of this as custom-tailored search results.
We start at the top of the User Experience Hierarchy of Needs pyramid, asking, "What kind of search experience is appropriate for this specific search?" Then we design that screen. This has led to at least 18 unique ways to view search results, with hundreds more planned. What’s interesting about this approach is just how different the search results can be, in terms of both the presentation of information and the data sources we choose to aggregate. As you’ll see, sometimes searching the entirety of the Web doesn’t produce the best results.
Mmm, good
Quick – think about finding an interesting chicken recipe. How would you go about that task? For many people, this would likely involve flipping through a collection of recipes (from books and magazines) until we found a recipe that looked good. Recipe selection is often based on the accompanying photograph. So what about finding recipes online? A search for "chicken recipes" from most search engines yields a list of results: no pictures, just written text – same as for any other search query (Figure 3). Is this the best way to browse recipes? It’s this line of questioning that led us to create a recipe search view (Figure 4) that brings back three nodes of data: the recipe title, source (a useful criterion for credibility), and a nice-looking photo – and all of these link to the source of the recipe.
To create this recipe search, we had to handpick a dozen leading recipe sites, limited to those that provide photos with their recipes. We’ve traded off breadth (searching the entire Web) for fidelity of information. We can deliver a more visual recipe search results page, but we may not cover every recipe out there. Considering that many people tend to search across a small number of recipe sites, we believe this was a fair tradeoff, especially if you can stay current with the most popular recipe sources. The result is a great way to search for recipes.
But this isn’t the only way people find recipes. Let’s replace our search for "chicken recipes" with "pecan pie." In this scenario, we aren’t looking for variations like, say, bittersweet chocolate pecan pie or fig pecan pie – all we want is a really good pecan pie recipe. In this case, photographs aren’t going to be all that useful; most pecan pies look alike. No, the most critical piece of data becomes the rating, or in some cases the identity, of the person providing the recipe. Designing for this scenario would lead us to a very different recipe SERP.
IMAGE ILLUSTRATION3
Figure 3. A search for "chicken recipes" on most search engines yields a simple text list of results.
Figure 4. A search for recipes on Viewzi returns recipe photo, title, source, and a link to the source.
IMAGE ILLUSTRATION4
Figure 5. Other ways to search for recipes.
And what about the scenario in which I have only a few ingredients in the pantry? In that case, being able to filter recipes by matching ingredients would quickly become most important. And if I were a diabetic, filtering recipes by nutritional data would be most important. You get the idea. There’s no one right way to design a better way to search for recipes; rather, there are dozens of opportunities, based on different needs and situations (Figure 5).
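The pantry scenario above amounts to a simple ranking problem. A rough sketch in Python of how such a filter might work (Viewzi’s actual implementation is not public; the recipe structure and function name here are purely illustrative):

```python
def pantry_match(recipes, pantry):
    """Return recipe titles ordered by the fraction of ingredients on hand.

    recipes: list of dicts with "title" and "ingredients" keys (assumed shape).
    pantry: list of ingredient names the user already has.
    """
    pantry_set = {item.lower() for item in pantry}
    scored = []
    for recipe in recipes:
        needed = {ing.lower() for ing in recipe["ingredients"]}
        # Score = share of the recipe's ingredients already in the pantry.
        overlap = len(needed & pantry_set) / len(needed)
        scored.append((overlap, recipe["title"]))
    scored.sort(reverse=True)
    # Drop recipes that match nothing in the pantry.
    return [title for score, title in scored if score > 0]
```

A nutritional filter for the diabetic scenario would follow the same pattern, scoring on sugar or carbohydrate fields instead of ingredient overlap.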
IMAGE ILLUSTRATION5
Power Grid
The Power Grid view presents information in a grid fashion, with the option of displaying screenshots or text. This interface also contains functionality to quickly "star" and launch multiple results at once. It’s perfect for certain kinds of exploratory research in which you might want to identify multiple sites that may be relevant, and then open these sites simultaneously.
IMAGE ILLUSTRATION6
Simple Text
This search view may look familiar, but we’ve thrown in a few twists. First, results are aggregated (and, where needed, de-duplicated) from both Google and Yahoo! Then we add in a screenshot thumbnail of each result for added visual recognition.
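The aggregation step described above can be sketched in a few lines of Python. This is a hypothetical illustration, not Viewzi’s code: it interleaves two engines’ ranked lists and collapses near-identical URLs (same host modulo "www.", same path modulo a trailing slash) into one entry.

```python
from itertools import chain, zip_longest
from urllib.parse import urlsplit


def merge_results(google_results, yahoo_results):
    """Interleave two engines' result lists, dropping duplicate URLs."""

    def normalize(url):
        # Loose normalization so near-identical links collapse together.
        parts = urlsplit(url)
        host = parts.netloc.lower()
        if host.startswith("www."):
            host = host[4:]
        return (host, parts.path.rstrip("/"))

    seen, merged = set(), []
    # Round-robin interleave preserves each engine's own ranking.
    for result in chain.from_iterable(zip_longest(google_results, yahoo_results)):
        if result is None:  # padding from the shorter list
            continue
        key = normalize(result)
        if key not in seen:
            seen.add(key)
            merged.append(result)
    return merged
```

Real de-duplication would likely also consider query strings and redirects; this sketch shows only the basic shape of the merge.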
Timeline View
Often, the timeliness of a result can be more important than its popularity or relevance, as with searching for code, recent news, or product releases. This specialized search view pulls 100 results from Google, then arranges them by date on an interactive (and dynamically scaling) timeline.
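Arranging results on a dynamically scaling timeline starts with grouping them by date. A minimal sketch, assuming each result arrives as a (url, date) pair – the field shapes are illustrative, not Viewzi’s actual data model:

```python
from datetime import date


def timeline_buckets(results):
    """Group dated results by (year, month) so a timeline can scale itself.

    results: iterable of (url, date) pairs, in any order.
    Returns a dict mapping (year, month) -> urls in chronological order.
    """
    buckets = {}
    for url, when in sorted(results, key=lambda r: r[1]):
        buckets.setdefault((when.year, when.month), []).append(url)
    return buckets
```

A rendering layer could then choose day, month, or year granularity depending on how widely the bucket keys are spread.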
Web Screenshot
The Web Screenshot view displays a screenshot for each search result instead of the typical text link. Aside from the visual appeal, this view is useful for visually identifying context cues (Web design styles and such) that may help in determining whether that result is relevant. This view has been especially popular with an older audience, where we’ve observed a preference for images over text. Snapshots are displayed one at a time, making it easy to focus on and find what you’re looking for.
Four Sources
One of our more experimental search views, the Four Sources view searches the leading search engines (Google, Yahoo!, Ask, and Microsoft), and displays results as screenshots in a grid that lets you "see" how search results from different engines compare. "Stacked" thumbnails and color coding indicate the source (or sources) that retrieved that site. Users can also toggle on or off each search engine, creating a unique mix of search results. The biggest value of this is as a meta-search tool for comparing how search results match up across the leading search engines.
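The stacking and color coding described above rest on one underlying computation: which enabled engines returned each URL. A hedged sketch of that bookkeeping (the dict shapes and function name are assumptions for illustration):

```python
def source_overlap(results_by_engine, enabled):
    """Map each URL to the set of enabled engines that returned it.

    results_by_engine: {"google": [url, ...], "yahoo": [...], ...}
    enabled: the engines the user currently has toggled on.
    """
    overlap = {}
    for engine in enabled:
        for url in results_by_engine.get(engine, []):
            overlap.setdefault(url, set()).add(engine)
    return overlap
```

A UI layer would render one "stacked" thumbnail per URL, with one color band per engine in its set; toggling an engine off is just a call with a smaller `enabled` list.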
IMAGE ILLUSTRATION7
Figures 6 and 7. From Digg Labs to PicLens, companies are designing more-visual ways to experience information.
Is a picture worth a thousand text links?
By knowing more about the content we are showing, we can also explore options beyond the familiar list of links – and these options have, for us, tended to be of the more visual sort. In this respect, we are excited to be part of a larger movement toward a richer, more visual display of information. From projects coming out of Digg Labs (Figure 6) to the PicLens plug-in (Figure 7), an increasing number of companies are designing more-visual, interactive ways to experience information. While the rich display of digital data is certainly nothing new, it is less common to see these same visualization efforts applied to dynamic data, where the length, frequency, and quantity of elements can vary widely (it is much harder to create rich visualizations when you don’t have the data defined ahead of time).
IMAGE ILLUSTRATION8
Figures 8 and 9. The newest version of Microsoft’s Xbox Live favors graphical representations over text, and the new Zune MixView is a highly visual way to explore musical options.
Microsoft has made some notable investments here in at least a couple of areas. The newest version of Xbox Live has a much slicker user interface for accessing games, movie rentals, profile information, and more. It eschews text in favor of graphical representations – images of game boxes are used instead of text listings, as an example (Figure 8). In a similar move, the new Zune MixView (Figure 9) is a highly visual way to explore related music options. As one journalist described it, "What we saw made iTunes’ simple Genius feature look like a blast from digital music’s past." The MixView presents a collage of 8 to 10 floating squares orbiting around a selected song, album, or artist.
Offering recommendations in a richer, more-engaging manner encourages people to keep digging around. This deeper exploration is also something we’ve seen validated in usability testing with our own Album View (Figure 10). People love the ability to browse music by album covers and band photos.
IMAGE ILLUSTRATION9
Figure 10. Viewzi’s Album View allows people to browse music search results using album covers and band photos.
In the retail space, Amazon.com is testing these same visual ideas with its WindowShop technology (Figure 11). In WindowShop, people browse through featured products displayed as tiles in a larger grid. While it does require direct engagement from the user (navigation is done via arrow keys), it’s a fairly passive way to experience information, prioritizing visual and auditory senses. Movies and games have trailers. Toys have commercials. Even book reviews have been dictated, so you don’t have to read a word. The whole experience is more like channel surfing than clicking on text links.
What’s ironic is that none of this is new to the design profession. From semiotics to Gestalt psychology, we’ve known for years the positive effect that visual cues and layouts have on perceptions. And neuroscience is showing that sorting massive amounts of information using our visual cortex far outstrips our ability to sort using textual analysis. So why has technology taken so long to move in this direction? There are several explanations. One, the technologies for delivering rich user interfaces are still evolving and gaining adoption. Two, this is a natural response to an overload of information. As the volume of content increases exponentially, the need for tools to make sense of this data and find exactly what one is looking for becomes even more important. Three, from APIs (application programming interfaces, which enable software applications to communicate with each other) to Yahoo!’s recent opening up of its search data, the technology that makes all this possible has only recently become available.
Is this the future, for real?
Imagine doing a search for "Michael Jordan sports scores" and being thrown into something that’s more like a portal page – sports-themed, with a listing of sports scores that has more in common with the back of a baseball card than search results. Sound farfetched? Given our expectations of search engines over the past decade, it’s easy to believe that simple text results are something sacred. And while these basic search results aren’t going away any time soon, there are some signs – mostly from Google – validating that the search space is moving to a more customized approach to search engine results pages. For starters, Google has already developed a number of customized search engines for certain categories. Most of us are familiar with image search, blog search, video search, or news, but did you know Google also has album search and car-buying results pages? Dozens of what we would call search views already exist from the world’s largest search engine.
It’s interesting to contrast this with Yahoo!, which has created or acquired verticals in dozens of topic areas. One criticism of Yahoo! has been its inability to coordinate all these properties. However, were it to do so, and incorporate something like unified search, we’d have a gateway to quite a bit of rich site information.
IMAGE ILLUSTRATION10
Figure 11. Amazon’s WindowShop technology allows shoppers to browse through a grid of products.
IMAGE ILLUSTRATION11
In 2007, Google introduced unified search, which (when appropriate) mixes other relevant search results into the regular text search results. Consider a search for the band Radiohead. News, videos, and blog posts related to Radiohead are mixed into the regular text results (but not necessarily in the top three results).
When you refine your search by adding "album" to the search query, the top result is Google’s own index of Radiohead albums and songs.
Clicking into this area reveals all albums (with album cover art) and songs by the band – and even options to buy! While not as visual as the Viewzi Album View, Google has essentially created a SERP specific to albums from a band.
As we see more and more topic experts and vertical search engines gaining prominence, the largest search company in the world taking tiny steps in the custom-tailored direction, and scores of companies big and small moving toward more-visual interfaces, one has to wonder: what might search results look like in 10 years?
But it’s all about a better experience
Obviously, there’s a lot more to search than the results pages. And there’s a lot more to search than the user interface. In fact, a new wave of companies is redefining what it means to search for something – especially in the mobile space. Shazam identifies songs by essentially listening to a few seconds of music. With Snapshot, you can get product information by simply taking a picture of an album or book. And back online, companies like Idee allow you to find photos based on similar colors. The point? The search space – from searching for information to consuming it – is wide open to be much more than simply text entry and a list of text results.
At Viewzi, we’ve chosen to focus on one aspect of search – how people experience information. To provide a better search experience, we position ourselves as brokers between the data sources (which include search engines) and the people looking for information. The result? Custom-tailored search results for different scenarios. Until recently, the search industry has been about finding information and content. We believe the next step in the maturity of search results will be a sharper focus on the activities and context behind specific search queries. And in this respect, perhaps we’re offering a glimpse at what the future of search might look like – a future that supports different perspectives.
Reprint #09201 AND23
SIDEBAR
Customized search seems fine for topical searches, but what about generic search?
The following are five different representations of basic search results (mostly from Google and Yahoo!). While the results themselves are roughly identical, or come from the same sources, the presentation of these results has been altered to support either different viewing preferences or different search activities.
IMAGE PHOTOGRAPH12
Stephen P. Anderson, Vice President, Product Strategy & Design, Viewzi
AUTHOR_AFFILIATION
Stephen P. Anderson is the vice president, product strategy & design, at Viewzi, where his group is focused on changing how people experience search results. He is passionate about elegant design, remarkable customer experiences, and managing maverick teams – topics he loves to write and speak about.
Prior to Viewzi, Anderson grew and led the user experience teams at both Sabre Travel Network and Bright Corner, a small creative and technology services company he cofounded in 2001. There he worked with a variety of businesses to create valuable online and offline customer experiences, with a special focus on custom business applications. Anderson has worked on Web 2.0-style applications with small startups, as well as on larger usability and information architecture projects for enterprise clients such as Nokia, Frito-Lay, and Chesapeake Energy. A former high school English teacher, he brings a love for language and cognitive learning theories to the design profession. As time permits, he shares his thoughts on management and design at poetpainter.com.