A Big Day for Google. Big. Day.


For those interested in search, today was a little of the Christmas-come-early variety. Google announced a group of new features that may well change how humans interact with news on the Web and on their mobile phones.

Starting with the star of the show, Google unveiled its long-anticipated real-time search. Following partnerships with Facebook, MySpace, FriendFeed, Jaiku, Identi.ca, and Twitter, the new Google results page will show the traditional popular items along with the latest breaking items from the real-time Web. This will allow searchers to view both the most popular items as well as tweets, blog posts, and news items as they are published. Check out the sample screenshot from the Google blog:

Clicking “latest” in search options brings the goods. “Latest” will work in conjunction with Google Trends’ hot topics as well. It is not yet available to everyone, but keep an eye out – it will be rolled out very, very soon.

That’s not all. Mobile is all the rage and is only becoming more popular and ubiquitous. Google recognizes this reality and has been developing fantastic mobile information tools to make search even more powerful. Google has also been banking on moving computing firmly into the cloud.

Voice search is not new, but Google reaffirmed its commitment to it today and introduced voice search in Japanese. Google also announced plans to push voice search well into the future, with automatic translation across languages performed simultaneously with the search itself.

Next, to complement “My Location,” real-time traffic, and turn-by-turn navigation, Google is looking to leverage location functionality by returning information about your surroundings. The feature, called “What’s Nearby,” is available on Google Maps for Android 1.6 or later. Soon it will come to the iPhone via a “Near me now” button. Not quite as soon, but in the new year, the results will also show local product inventory and location-specific search terms.

Finally, and perhaps the most geeky-tech-worthy announcement of the day, enter Google Goggles for mobile phones. Take a picture with the phone camera and Google will match the image against its own massive databases and return relevant information about the object. It currently works for landmarks, art objects, and products. Goggles is Android-only for now, but it will undoubtedly expand as it develops. Sounds a bit like augmented reality, search style.

You can check out more about Google Real Time here.

You can check out more about Google Mobile here.

While kudos go to Google for pushing the search envelope even further and hurrying the future along, the real win here goes to the users. I can’t wait.


Consulting Wikipedia Voids Conviction


Outside research doesn’t help a juror’s cause in Maryland: in a quest for understanding, a juror consulted Wikipedia about two medical terms that may have swayed the juror’s decision to convict a homeless man of murder. Consequently, the Maryland Court of Appeals overturned the conviction and life sentence. The case, Allan Jake Clark v. State of Maryland, was reported in yesterday’s issue of the Maryland Daily Record. According to reporter Steven Lash,

the juror’s Wikipedia search denied Allan Jake Clark a fair trial because “the right to an impartial jury embraces the right to have the case decided exclusively on the evidence that is produced in open court,” the Court of Special Appeals held in an unreported opinion.

Thus, Anne Arundel Circuit Court Judge Paul F. Harris Jr was wrong not to have declared a mistrial upon discovering that juror Alfred Rudolph Schuler had looked up the terms “livor mortis” and “algor mortis” on Wikipedia, an online reference site, and printed out the pages, the appellate court stated in its 3-0 decision.

The reasoning behind the reversal is not news to attorneys: consulting any information outside that presented in the course of the trial is potentially prejudicial and grounds for a mistrial. What is interesting is that jurors, and people in general, think nothing of turning to the internet for answers to any question, including the meaning of scientific and medical terms like “livor mortis” and “algor mortis.” Juror Alfred Rudolph Schuler didn’t even consider his actions to be outside research. As the article quotes him: “I did go to Wikipedia and I looked up the meaning of ‘lividity,’” Schuler told his questioners, referring to the general term for blood flow after death. “To me that wasn’t research. It was a definition.” The trial court permitted the case to continue after discovering printed copies of the Wikipedia entry, thereby muddying the grounds for the conviction.

This wasn’t the first time this year a Maryland court considered this issue. In May, the appellate court overturned a conviction in Wardlaw v. State because a juror had looked up the term “oppositional defiant disorder.”

Researching information on the internet has become second nature to many, to the point where looking up arcane bits of specialized information on Wikipedia, Google, Bing or any other virtual resource is like checking the traffic report on the radio. We are truly in the Information Age, where everyone can become a scholar. All the more reason to apply care in choosing the resources to consult and demanding that those resources be accurate.
