Alex Barnett blog

Stuff

March 2007 - Posts

Getting the Microsoft / TellMe deal done

Following yesterday's confirmation that TellMe Networks will be acquired by Microsoft, this morning I read an article by Ina Fried based on an interview with Mike McCue, the CEO of TellMe and an ex-Netscape exec (via Dan Farber).

I'm highlighting this because the article provides some insight into how some of these kinds of deals are done and communicates a little of the excitement that comes along with them.

In this case, Steve Ballmer was apparently very hands-on in forming and validating the deal, spending Super Bowl Sunday in a conference room in building 34 next to his office (across the path from my own in building 35) with a bunch of Microsoft and TellMe execs:

"The day before the Super Bowl, executives for Microsoft and Tellme including Ballmer and McCue, spent most of the day talking. But Ballmer, clearly looking for more details, called McCue at 11:30 a.m. on Super Bowl Sunday asking for another go around. As quick as they could, McCue and his team left the posh Hotel 1000 in downtown Seattle, ran to their Ford rental car and sped across the Lake Washington bridge toward Redmond, Wash."

Wow. The TellMe execs' hearts must have been racing! Can you imagine the conversation between them in the car?...And the adrenaline rush that comes from knowing you're about to get grilled, in a good way (you hope). Once there, Ballmer drove the number crunching:

"Ballmer peppered Tellme executives with questions about their business, entering the answers into an Excel spreadsheet he had started from scratch.

Three-and-a-half hours later he had built a complete business model, and one thing was clear: an acquisition made a lot of sense."

That was Feb 4. According to McCue, the deeper validation then began:

"By the time the executives left the conference room that evening, they knew they were on to something.

"We had already been talking to Microsoft across multiple areas, but this is really when it started to really come together," McCue said. "It became clear this was a serious opportunity for both companies. The next few weeks were largely spent making sure those intuitions--and the numbers on Ballmer's spreadsheet--checked out."

About five weeks after that meeting, the deal was announced.

I really love this story. It reminds me of the deal cut five years ago with my previous company (now part of IBM) and the emotional ride involved: the satisfaction of having the value of a company you've spent a huge amount of energy building validated; the natural concern that a random bombshell might kill the deal; the magic of realizing a very real payoff on the risks and sacrifices made along the way in chasing a dream; and the excitement of moving a business into a new phase of its life. Great stuff. Congratulations to the TellMe team.

Posted: Mar 15 2007, 08:43 AM by alexbarnett | with no comments
Filed under:
Confirmed: Microsoft to Acquire Tellme Networks

There were rumours flying around earlier this week (and before that), but Microsoft confirmed this morning that it will acquire Tellme Networks:

"Microsoft Corp. today announced it will acquire Tellme Networks, Inc., a leading provider of voice services for everyday life, including nationwide directory assistance, enterprise customer service and voice-enabled mobile search. Microsoft and Tellme share a vision around the potential of speech as a way to enable access to information, locate other people and enhance business processes, any time and from any device. Combining Tellme’s talented people and expertise in high-volume voice services with Microsoft’s platform, resources and worldwide customer reach will inspire new and innovative solutions."

Talented people indeed.

"This acquisition will mark an important step forward in Microsoft’s strategy for delivering software plus services that put people at the center of technology solutions in the office, at home and on the go. For more than a decade, Microsoft has enabled speech, handwriting and touch as forms of natural user input, making computing and digital devices easier to use. Combining Tellme’s technologies with Microsoft’s existing and future products and services will help improve the way people use voice to find, use and share information"

OK, examples? From the press release:

  • Unified communications. Tellme’s voice-enabled services and solutions for enterprise customers complement Microsoft’s unified communications voice services portfolio. This will allow customers and industry partners to build highly scalable voice solutions that leverage rich identity, presence, messaging and application integration.
  • Speech platform. Tellme’s robust voice-enabled platform helps open new doors for Microsoft’s hundreds of thousands of developers and partners to build innovative speech solutions based on open standards.
  • Mobile services and search. Tellme’s speech expertise and work in mobile search, combined with Microsoft’s innovative local and mobile search offerings, will help take the mobile search usability experience to the next level.
  • Software plus services. In the long term, Tellme technology will enhance Microsoft’s many voice-enabled applications, including the Windows Vista™ operating system, the Microsoft Office system, and mobile applications such as Windows Mobile® and Windows® Automotive.

Financial details have not been disclosed but the deal is expected to close in the second quarter of 2007.

You can tune into a conference call starting at 9:30am PST with an on-demand version available for a few days after that.

Update:

Just read the take on this from Don Dodge (a director on Microsoft's Business Development team):

"TellMe will be an important piece of the mobile search puzzle. Users will be able to speak queries on their cell phones and get back results in text or speech. This is a huge step forward for mobile search and voice based search.

You might recall that I have been talking about Mobile Search and Local Search as two of the most lucrative search markets still up for grabs. It should be no surprise that Microsoft made this acquisition to go after these markets. Great move!"

Does REST need a WSDL?

First, read this, written by Marc Hadley:

"This article describes the Web Application Description Language (WADL). An increasing number of Web-based enterprises (Google, Yahoo, Amazon, Flickr - to name but a few) are developing HTTP-based applications that provide access to their internal data using XML. Typically these applications are described using a combination of textual protocol descriptions combined with XML schema-based data format descriptions; WADL is designed to provide a machine processable protocol description format for use with such HTTP-based Web applications, especially those using XML."

Then read this, by Sam Ruby:

"Those that merely attempt to produce compliant WSDL based on the available specifications often find problems such as these.  But the siren call for viewing the programmable web as merely a serialization format seems unstoppable: the current incarnation is called WADL."

Then read this: Automatic Multi Language Program Library Generation for REST APIs, by Thomas Steiner

"The question of describing (REST) web services in a machine-readable way other than WSDL has been raised before[9]. However, often the motivation behind was more to get rid of WSDL rather than actually solving the REST description issues. Many suggestions are more or less ad hoc inventions designed to solve particular problems. It is to be noted that with WSDL 2.0 it is possible to describe REST services[10], but here we want to focus on some examples of non-WSDL approaches. As Sun Microsystem's Norman Walsh writes[11]: "We know the hard things are possible, we just have to make the easy things easy." "

Then see this, also by Thomas Steiner.
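To make the "machine processable" point concrete, here's a rough sketch: a tiny, made-up WADL-style document walked with Python's standard library, the way a client or stub generator might. (The namespace URI, service address and parameter names are invented for illustration - they're not taken from the spec or from any of the articles above.)

```python
import xml.etree.ElementTree as ET

# A tiny, hypothetical WADL-style document describing one GET resource.
# Element names follow the WADL draft's vocabulary; the namespace URI
# and the service itself are made up for illustration.
WADL = """\
<application xmlns="http://wadl.example/ns">
  <resources base="http://api.example.com/">
    <resource path="search">
      <method name="GET">
        <request>
          <param name="q" style="query" required="true"/>
        </request>
        <response>
          <representation mediaType="application/xml"/>
        </response>
      </method>
    </resource>
  </resources>
</application>
"""

NS = {"w": "http://wadl.example/ns"}
root = ET.fromstring(WADL)

# Because the description is machine readable, a tool can walk it and
# discover resources, methods and parameters without human help:
for resource in root.findall(".//w:resource", NS):
    for method in resource.findall("w:method", NS):
        params = [p.get("name") for p in method.findall(".//w:param", NS)]
        print(method.get("name"), resource.get("path"), params)
```

That's the whole pitch in miniature: everything a generated client library needs - the verb, the path, the query parameters - is sitting in the description, not in prose.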

What do you think?

-shutting down comments on this post due to spam-

'how much' is a petabyte?

Last week I quoted a footnote to this post trying to convey the size of an exabyte:

"an exabyte is 1,000,000,000,000,000,000 bytes OR 1018 bytes - there 1024 petabytes in an exabyte or 1,073,741,824 gigabytes in an exabyte.  To give you an idea of what this means, five exabytes of information is equivalent in size to the information contained in 37,000 new libraries the size of the Library of Congress book collections."

So there are 1024 petabytes in an exabyte. But 'how much' is a petabyte? Jonathan Schwartz has provided a nice post giving us an idea of its muchness:

"A petabyte is a thousand terabytes, which is a million gigabytes, or a billion megabytes. Or 8 billion megabits. With me so far?

So if you had a half megabit per second internet connection, which is relatively high in the US (relatively low compared to residential bandwidth available in, say, Korea), it'd take you 16 billion seconds, or 266 million minutes, or 507 years to transmit the data. Can you sail to Hong Kong faster than that? At a full megabit, just divide the time in half. Even at a hundred megabits (about the highest, generally available, of any carrier I've seen), it's a few years."
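Schwartz's arithmetic checks out, and it's easy to play with. A quick sketch using decimal units, as he does (so 1 petabyte = 8 × 10^15 bits):

```python
# Time to transmit one petabyte at a given link speed.
# Decimal units, as in Schwartz's figures: 1 PB = 8 * 10^15 bits.
PETABYTE_BITS = 8 * 10**15
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def transfer_time_years(link_mbps):
    """Years needed to push one petabyte through a link of link_mbps Mbit/s."""
    seconds = PETABYTE_BITS / (link_mbps * 10**6)
    return seconds / SECONDS_PER_YEAR

for mbps in (0.5, 1, 100):
    print(f"{mbps} Mbit/s: {transfer_time_years(mbps):,.1f} years")
```

At half a megabit that comes out at just over 507 years - his 16 billion seconds - and even at a hundred megabits it's still around two and a half years.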

Posted: Mar 13 2007, 08:24 AM by alexbarnett | with no comments
Filed under:
MSXML4 will be kill bit-ed in IE

The Microsoft XML team has announced MSXML4 will be kill bit-ed in IE in the October - December 2007 timeframe.

The team is also *strongly recommending* that developers who use MSXML program against MSXML6, and that apps built on older versions be upgraded to MSXML6.

Good background reading on the topic is this October '06 post by Adam Wiener explaining the versioning history of MSXML and best practices.


Posted: Mar 13 2007, 08:19 AM by alexbarnett | with no comments
Filed under: ,
How OpenDNS solved my really annoying DNS problem

About 3 weeks ago I started getting the "Page Cannot be Displayed" error message on my machine when using my home network. It would happen occasionally at first, on maybe 1 in 20 sites. When the error occurred, I'd hit refresh - that would solve it maybe 1 time in 5. If that didn't work, I'd try the site in Firefox, where the error repeated around 1 time in 2. An hour or so later, a site that didn't work previously would, and a site that did work previously wouldn't. Annoying.

I couldn't replicate this behaviour on my company network, so I was sure it was either my local machine or my router. I tried the usual things to clear up the problem (ipconfig /release, /renew, /flushdns), enabled/disabled network adapters, removed temp files in the browser, rebooted the router, reset the router, etcetera, etcetera. I had some hope as ipconfig /flushdns would solve the problem maybe 1 time in 3, but then the site would get funky again in a few minutes. Annoying.

Then my wife complained that she had been getting the same problem on her machine for the past few days or so. So I boiled it down to either my router going weird on me or my ISP's DNS server doing something funky. I called the ISP and they hadn't heard of any similar issue reported by other customers. I tried to replicate the problem while on the phone with the ISP (I did, eventually, after everything began working perfectly the instant I had a support engineer on the line...) but they didn't have an answer - as far as they were concerned everything was cool at their end. Annoying.

I contacted my router's manufacturer (D-Link). They suggested changing the DNS settings on the adapters manually to 4.2.2.1 and 4.2.2.2. This worked a little better for an evening (e.g. 1 in 20 would fail instead of the 1 in 10 by this point), but it didn't solve the issue, not even close. In fact, things started getting worse the next day. I was convinced it was a router issue so I popped to Fry's and bought another (D-Link again - I'm loving their routers btw). I got home, set up the new router and yet the issue was still there. Annoying.

Believing it wasn't a problem with the machines, and believing it wasn't a problem with the ISP's DNS server (well, that's what they told me), and believing it wasn't a problem with (either of) my router(s) or their settings, I believed I was at a loss and, yes, quite annoyed.

While going mad looking for other solutions online ('cause I wasn't connecting to the sites, at this time around 1 in 2...), I stumbled across OpenDNS.com. I read the FAQ (especially the privacy related stuff), read about the various other benefits and gave it a go. I followed the instructions to set my router to use the OpenDNS service and, magically, everything worked. I've been using the service (free) for a week now and I've not had the slightest problem on any machines on my network. Instantly un-annoyed.

So I did find a work-around...but I didn't find the root cause...and I'm curious. I should get back to the ISP and let them know how I fixed it - presumably the fact that the OpenDNS solution worked for me points to a problem with their dns server? We'll see.
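One way to chase that root cause would be to fire the same lookups at the ISP's resolver and at OpenDNS and compare failure rates. Here's a rough sketch that hand-rolls a minimal DNS query with Python's standard library - the OpenDNS addresses in the comment are the ones the service publishes; the hostname and everything else are illustrative:

```python
import socket
import struct

def build_dns_query(hostname, qtype=1):
    """Build a minimal DNS query packet (QTYPE 1 = A record)."""
    header = struct.pack(">HHHHHH",
                         0x1234,   # transaction ID (arbitrary)
                         0x0100,   # flags: standard query, recursion desired
                         1, 0, 0, 0)  # 1 question; no answer/authority/additional
    question = b""
    for label in hostname.split("."):
        question += bytes([len(label)]) + label.encode("ascii")
    question += b"\x00"                       # end of name
    question += struct.pack(">HH", qtype, 1)  # QTYPE=A, QCLASS=IN
    return header + question

def resolver_answers(resolver_ip, hostname, timeout=2.0):
    """True if the given resolver returns a NOERROR response for hostname."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    try:
        sock.sendto(build_dns_query(hostname), (resolver_ip, 53))
        data, _ = sock.recvfrom(512)
        rcode = data[3] & 0x0F  # low 4 bits of byte 3 = response code
        return rcode == 0       # 0 = NOERROR
    except socket.timeout:
        return False
    finally:
        sock.close()

# e.g. compare your ISP's resolver against OpenDNS
# (208.67.222.222 / 208.67.220.220):
#   resolver_answers("208.67.222.222", "example.com")
```

Run that in a loop against both resolvers for an hour: if the ISP's server keeps timing out while OpenDNS answers every time, that's a fairly pointed thing to take back to the support line.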

Posted: Mar 06 2007, 10:58 PM by alexbarnett | with 2 comment(s)
Filed under:
Grazr gets financing and more.

Adam Green, co-founder and CEO of Grazr Corp, mailed me tonight to let me know Grazr has completed a Series A financing round of $1.5M. Dan Bricklin (co-creator of VisiCalc and, much more recently, author of WikiCalc) has agreed to join the Board of Directors too. Nice! Adam has more details here.

Grazr is a publishing tool for feeds. It lets you display RSS, RDF, Atom, and OPML files as a widget on any webpage. It's all JavaScript, so no software download or installation is necessary to view it in a browser.

I've been tracking Grazr for a while now. A year ago, Joshua Porter, John Tropea and I spent some time with Adam on the topic of OPML and its future and recorded the conversation as a podcast. Would be fun to catch up with Adam again and see where OPML and associated tech has come in the last 12 months.

Congratulations to both Mike Kowalchik (original author of Grazr & CTO) and Adam!

Posted: Mar 06 2007, 08:48 PM by alexbarnett | with 1 comment(s)
Filed under: , ,
A "one-third probability"

Former Federal Reserve chairman Alan Greenspan was quoted as seeing a “one-third probability” of recession in the United States this year, according to an interview with Bloomberg.

I'd say he has a 33% probability of being right.

Microsoft Research TechFest

Duncan Mackenzie (on the Channel 9 team) provides a pointer to the first of a number of video interviews by Rory Blyth with some MSR folk, covering the Microsoft TechFest event happening this week on the Redmond campus:

"Every year at Microsoft there is a little internal-only conference where Microsoft Research (MSR) shows off their projects. It is very cool but very private (badges checked at the door, etc...). Well, this year it is a bit different... For MSR TechFest 2007 they've allowed folks to come in and interview the researchers, take pictures/video, whatever... crazy stuff and you get to see the results first on Channel 9."

Mary Jo Foley is covering the TechFest over at her ZDNet blog, while Joshua Allen has a little more info on the two projects Rory focusses on in the first video, DynaVis and FASTDash.

Posted: Mar 06 2007, 02:48 PM by alexbarnett | with no comments
Filed under: ,
The Expanding Digital Universe

IDC and EMC have released a new study today - 'The Expanding Digital Universe: A Forecast of Worldwide Information Growth Through 2010'. (press release here).

It's fascinating stuff. The research follows previous work conducted at the University of California, Berkeley (I've blogged about this previously here). The methodology of the IDC/EMC study differs from the Berkeley study's: the Berkeley study examined the creation of original information (not including copies) and estimated how much digital information it would represent if all of it were converted to digital format (think: the total amount of information we create). The IDC/EMC study is a forecast for devices that create or capture digital information – PCs, digital cameras, servers, sensors, etc. – and estimates the total number of megabytes they capture or produce in a year (think: the actual and forecasted size of the digital universe).

So on to the interesting tidbits from the IDC/EMC study:

  • Between 2006 and 2010, the information added annually to the digital universe will increase more than sixfold, from 161 exabytes to 988 exabytes*, a compound annual growth rate of 57%.
  • While nearly 70% of the digital universe will be generated by individuals by 2010, organizations will be responsible for the security, privacy, reliability and compliance of at least 85% of the information.
  • Images, captured by more than 1 billion devices in the world, from digital cameras and camera phones to medical scanners and security cameras, comprise the largest component of the digital universe.
  • The number of images captured on consumer digital still cameras in 2006 exceeded 150 billion worldwide, while the number of images captured on cell phones hit almost 100 billion. IDC is forecasting the capture of more than 500 billion images by 2010.
  • The number of e-mail mailboxes has grown from 253 million in 1998 to nearly 1.6 billion in 2006. During the same period, the number of e-mails sent grew three times faster than the number of people e-mailing; in 2006 just the e-mail traffic from one person to another – i.e., excluding spam – accounted for 6 exabytes.
  • Unstructured Data – Over 95% of the digital universe is unstructured data. In organizations, unstructured data accounts for more than 80% of all information.
    • The report says: "IDC believes that over time it will become easier to deal with unstructured data as (1) more and more metadata is added to unstructured data, (2) structure is added to unstructured data, and (3) access systems provide structured views of both structured and unstructured data."
    • Interestingly, the study refers to the Semantic Web as a research area to follow regarding this topic.
  • Chevron's CIO says his company accumulates data at the rate of 2 terabytes – 17,592,000,000,000 bits – a day.
  • Wal-Mart - reputed to have the largest database of customer transactions in the world. In 2000, that database was reported to be 110 terabytes, recording and storing information on tens of millions of transactions a day. By 2004, it was reported to be half a petabyte.

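The headline growth figures in the first bullet hang together, by the way; a quick sanity check on the implied compound annual growth rate:

```python
# 161 exabytes added in 2006, growing to 988 exabytes in 2010:
# slightly more than sixfold over the four-year span.
start_eb, end_eb, years = 161, 988, 4

growth_factor = end_eb / start_eb
cagr = growth_factor ** (1 / years) - 1

print(f"{growth_factor:.1f}x overall, {cagr:.0%} compound annual growth")
```

That works out at roughly 6.1x overall and about 57% a year - matching the quoted rate.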
So if the digital universe is expanding exponentially - and it looks like 'we' are the ones generating most of it (and consuming it) - how are we going to cope with the ever increasing amount of information?

----

*In case you're wondering, an exabyte is 1,000,000,000,000,000,000 bytes, or 10^18 bytes - there are 1024 petabytes in an exabyte, or 1,073,741,824 gigabytes in an exabyte. To give you an idea of what this means, five exabytes of information is equivalent in size to the information contained in 37,000 new libraries the size of the Library of Congress book collections.
