Microsoft’s Bing search engine has never made much of a dent in Google’s dominance in the more than 13 years since it launched. Now the company is hoping some buzzy artificial intelligence can win converts.
Microsoft on Tuesday announced an updated version of Bing designed to combine the conversational fun and convenience of OpenAI’s viral ChatGPT tool with the information-gathering power of a search engine.
Beyond providing a list of relevant links like traditional search engines, the new Bing also creates written summaries of the search results, chats with users to answer additional questions about their query and can write emails or other compositions based on the results. With the new Bing, for example, users can create trip itineraries, compile weekly meal plans and ask the chatbot questions when shopping for a new TV.
This is the new era of search that Microsoft (MSFT) — which is investing billions of dollars in OpenAI — envisions, one where users are accompanied by a sort of “co-pilot” around the web to help them better synthesize information. The company is betting on the new technology to drive users to Bing, which has for years been an also-ran to Google Search. Microsoft also announced an updated version of its Edge web browser with the new Bing capabilities built in.
The event comes as the race to develop and deploy AI technology heats up in the tech sector. Google on Monday unveiled a new chatbot tool dubbed “Bard” in an apparent bid to keep pace with Microsoft and the success of ChatGPT. Baidu, the Chinese search engine, also said this week it plans to launch its own ChatGPT-style service.
The updated Bing and Edge launched to the public on a limited basis Tuesday and are set to roll out to millions of people for unlimited search queries in the coming weeks. I took Bing for a spin at a press event at Microsoft’s Redmond, Washington, headquarters Tuesday.
The tool provides the sort of immediate gratification we now expect from the internet — rather than clicking through a bunch of links to suss out the answer to a question, the new Bing will do that work for you. But it’s early days for the technology, which Microsoft says is still evolving.
The homepage of the new Bing feels familiar: you can type a query into the search bar and it returns a list of links, images and other results like a typical search engine. Alongside those results are written summaries, complete with annotations and links to the original information sources. The search field allows up to 2,000 characters, so users can type the way they’d talk, rather than having to think of a few precise search terms to use.
Users can also click over to a “chat” page on Bing, where a chatbot can answer additional questions about their queries.
I asked Bing to write me a five-day vegetarian meal plan. It returned a list of vegetarian meals for breakfast, lunch and dinner for Monday through Friday, such as oatmeal with fresh berries and lentil curry. I then asked it to write me a grocery list based on that meal plan, and it returned a list of all the items I’d need to buy organized by grocery store section.
Based on my request, the Bing chatbot also wrote me an email that I could send to my partner with that grocery list, complete with a “Hi Babe” greeting and “XOXO” closing. It’s not exactly how I’d normally write, but it could save me time by giving me a draft to edit and then copy and paste into an email, rather than having to start from scratch.
The generated portions of Bing have personality. When you ask the chatbot a question, it responds conversationally and sometimes with emojis, letting you know it’s happy to help or that it hopes you have fun on the trip you’re planning.
With the new Edge browser, I asked the tool to summarize one of my articles, and then turn that into a social media post the length of a short paragraph with a “casual” tone that I could share on Twitter or LinkedIn.
An imperfect tool
The new Bing is built in partnership with OpenAI — the company behind ChatGPT in which Microsoft has invested billions — on a more advanced version of the technology underlying the viral chatbot tool. Still, the new Bing has some of the quirks that the public version of ChatGPT is known for. For example, the same query may return different responses each time it’s run; that variability is partly inherent to how the tool generates answers, and partly because it pulls the most up-to-date search results each time it runs.
It also didn’t cooperate with some of my requests. After the first time it created a meal plan, grocery list and email with the list, I ran the same requests two more times. But the second and third time, it wouldn’t write the email, instead saying something like, “sorry, I can’t do that, but you can do it yourself using the information I provided!” The tool is also sensitive to the wording used in queries — a request to “create a vegetarian meal plan” provided information about how to start eating healthier, whereas “create a 5-day vegetarian meal plan” provided a detailed list of meals to eat each day.
Even next-gen search technology isn’t immune to basic flubs. I can imagine using the tool ahead of an upcoming local election, to learn about who is running for office in my area, what their positions are and how and when to vote. But when I asked the chatbot, “when is the next election in Kings County, NY?” it returned information about the November election last year.
The new Bing may also present some of the same concerns as ChatGPT, including for educators. I asked Bing’s chatbot to write me a 300-word essay about the major themes of the book “Pride and Prejudice” and, in less than a minute, it had pumped out 364 words on three major themes in the novel (although some of the text sounded a bit repetitive or wonky). Per my request, it then revised the essay as if it were written by a fifth grader.
The chatbot tool has feedback buttons so users can indicate whether its answers were helpful, and they can also chat directly with the tool to tell it when answers were incorrect or unhelpful, the company says.
“We know we won’t be able to answer every question every single time … We also know we’ll make our share of mistakes, so we’ve added a quick feedback button at the top of every search, so you can give us feedback and we can learn,” Yusuf Mehdi, Microsoft’s vice president and consumer chief marketing officer, said in a presentation.
With some controversial search topics, it appears the new Bing chatbot simply refuses to engage. For example, I asked it, “Can you tell me why vaccines cause autism?” to see how it would react to a common medical misinformation claim, and it responded: “My apologies, I don’t know how to discuss this topic. You can try learning more about it on bing.com.” The same query on the main search page returned more standard search results, such as links to the CDC and the Wikipedia page for autism.
Likewise, the chatbot refused a request for instructions on how to build a pipe bomb, instead saying in its answer, “Building a pipe bomb is a dangerous and illegal activity that can cause serious harm to yourself and others. Please do not attempt to do so.” However, one of the links provided in the annotation of its answer brought me to a YouTube video with apparent instructions for building a pipe bomb.
Microsoft says it developed the tool in keeping with its existing responsible AI principles and made efforts to guard against its potential misuse. Executives said the new Bing is trained in part on sample conversations mimicking bad actors who might want to exploit the tool.
“With a technology this powerful I also know that we have an even greater responsibility to make sure that it’s developed, deployed and used properly,” said responsible AI lead Sarah Bird.