nru, from LastMinute.com. Shows nearby information about entertainment. Not a game, but has a pretty, game-like (if somewhat impractical) interface which makes good use of the compass, GPS, orientation sensor, and camera. UK, G1 only for now.
Posted at 09:15 AM in Advertising, Augmented Reality, Data Mining, Entertainment, Location Aware, Marketing, Mobile | Permalink | Comments (0) | TrackBack (0)
Great post over on Jeff Jonas' blog. His point, to oversimplify grossly, is that search is increasingly going to be about context, and that the next competitive frontier will be in capturing context to make search smarter and more effective. Check it out here.
Posted at 05:18 PM in Data Mining, KDD, Semantic Web, Web/Tech | Permalink | Comments (0) | TrackBack (0)
Jeremy Liew has a post worth reading over on Lightspeed's blog, with a rather long-winded reply from me.
Posted at 11:37 AM in Advertising, Data Mining, Marketing, Targeting, Web Marketing, Web/Tech, Weblogs | Permalink | Comments (0) | TrackBack (0)
There's a post on TechCrunch today on Peer39. Worth reading.
Peer39 is a semantic analysis-enabled ad network. The company's approach to information retrieval, their market focus, and their understanding of the limits of their tech make this the first viable semantic web company. While the company does all the usual "natural language processing" heuristic stuff which has come to be synonymous with Web 3.0 / "The Semantic Web," they also do what appears to be collaborative filtering and machine learning. In other words, they are at least partly making up for the shortcomings of heuristic approaches to information extraction with statistical analysis.
As I've remarked before, it is impossible, given the current (and reasonably foreseeable) state of computer science for IE to work well enough to bring about the vision of the Semantic Web in the oft-cited travel agent example. You can do NLP query parsing, define microformats, come up with better and better ontologies, and so forth all you like, and you will never solve the problem of incompletely, inconsistently, and poorly tagged source data. Machines are too stupid and people are too lazy for all that data to ever get tagged right. These things will not change in our lifetimes.
What makes Peer39 a sensible company is that they understand this and their goal is not to create a domain non-specific, highly accurate, robust information extraction service that enables the Semantic Web. They just want to analyze content somewhat less inaccurately in order to enable ads to be served that will get a somewhat better clickthrough rate. Improving CTRs is highly measurable and gets you paid; online ad serving is one area where having a better mousetrap really will get the world beating a path to your door.
My guess as to why this company is doing it right is because the founders and key technical leaders come out of online advertising and intelligence services. The ad people know where the pain points are and what level of "better" is enough to get market traction; the ex-spies know the limits of semantic tech and information extraction because intelligence services have been using that tech in production longer than anyone - those guys know what level of "better" is truly achievable, and how. This team contrasts with most semantic web startups which are long on "visionaries" and researchers, and short on people who have had to use this tech with money (or lives, or national security) on the line.
This will be an interesting company to watch.
Posted at 01:25 PM in Advertising, Data Mining, KDD, Marketing, Semantic Web, Web Marketing, Web/Tech | Permalink | Comments (0) | TrackBack (0)
Behavioral targeting is the new frontier in internet advertising. There's an interesting article on eMarketer about where we are today. The gist of the article is that it's not quite there yet: despite all the buzz, there is very little being done among advertisers that takes advantage of behavioral analysis-based ad targeting technology. This is partly because it's new and immature, partly because of privacy concerns, and partly because of the adjacent-content placement problem.
More fundamentally, the real reason is that behavioral targeting is just another proxy. It doesn't solve the real problem. It gets you part of the way there, and maybe part way isn't enough to really motivate advertisers. Sure, behavioral targeting can help you understand what customers have done within the context they have been provided, and thereby helps predict what new customers exhibiting similar patterns in similar context might be inclined to do. This can be quite valuable. However, it doesn’t get to the heart of what strategic ad targeting is all about: knowing who a customer is, what her needs are, and providing something truly, personally relevant.
Behavioral targeting, like nearly all internet ad targeting and behavioral analysis, is essentially forensic. You can only predict based on what you test. You can only test within the narrow context of your existing site design, content, information architecture, etc. Run A/B tests all day long, use really spiffy experimental designs, and use all sorts of fancy math, but at the end of the day you are constrained by your context, and only get viable analysis within a narrow scope. The resulting prescriptions are tactical, not strategic.
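To make the "you can only predict based on what you test" point concrete, here is a minimal sketch of the kind of analysis an A/B test yields: a two-proportion z-test comparing the click-through rates of two ad variants. The counts are made up for illustration; the result tells you which variant performed better in this context, and nothing about why, or what the customer actually wants.

```python
# Sketch: two-proportion z-test comparing click-through rates (CTRs)
# of two ad variants in an A/B test. All counts are hypothetical.
from math import sqrt, erf

def ctr_z_test(clicks_a, views_a, clicks_b, views_b):
    """Return (lift, two-sided p-value) for variant B's CTR vs variant A's."""
    p_a, p_b = clicks_a / views_a, clicks_b / views_b
    # Pooled proportion under the null hypothesis that both CTRs are equal.
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_b - p_a, p_value

lift, p = ctr_z_test(clicks_a=120, views_a=10_000, clicks_b=150, views_b=10_000)
print(f"lift={lift:.4f}, p={p:.3f}")
```

Note that even a statistically solid result here is only valid within the narrow scope of the two creatives, the site, and the audience you happened to test - which is exactly the forensic, tactical limitation described above.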
To come up with the right creative, to find the right venue to place it, you need to know the customer – who they are, what they need – not their clickstream. Don’t get me wrong: the clickstream is a valuable data set, and is certainly better than nothing. Knowing what your customers do when confronted with a certain situation is certainly better than just guessing and hoping. However, to act strategically, you need to know your customers’ real interests, needs, and preoccupations. Fundamentally, behavioral targeting is destination-centric. Effective advertising needs to be people-centric.
Posted at 11:44 AM in Data Mining, Targeting, Web Marketing | Permalink | Comments (0) | TrackBack (0)
Today's New York Times has an article entitled "Guessing the Online Customer's Next Want," the basic point of which is that giving customers good recommendations is hard. The article is a nice general-audience discussion of how Amazon, Netflix, et al. do their recommendations, what some people are doing to improve them, and why it's hard to get real improvement.
The basic technique that everyone uses is collaborative filtering, and it was
invented in the early '90s by my friend and business partner Jeremy Bornstein (he holds the original patent),
among others. Collaborative filtering essentially takes a pile of data that
a person has generated, compares it to piles of data other people have
generated, and looks for similarities and differences. When a person has
a highly similar data pile to another person, or better yet, to a cluster of
other people who have similar data piles, one can infer that the areas of
dissimilarity are potential grounds for becoming more similar - i.e. all these
people who seem to share your movie/music/book/pet/whatever preferences have
bought this thing, but you don't have it yet: maybe you want it too. It works
pretty well, with pretty well being a relative thing. Boosting
sales by even a few percentage points is well worth it for most internet
retailers.
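The "piles of data" comparison described above can be sketched in a few lines. This is a hypothetical, toy user-based collaborative filter: each user's pile is just the set of items they've bought, similarity is Jaccard overlap between sets, and the items owned by similar users but not by you become the recommendations. (The data, names, and scoring are all illustrative; real systems use ratings matrices and far more sophisticated similarity measures.)

```python
# Toy user-based collaborative filtering: recommend items that users
# with similar purchase histories own but the target user does not.
purchases = {
    "alice": {"book_a", "book_b", "album_x"},
    "bob":   {"book_a", "book_b", "album_y"},
    "carol": {"book_a", "album_x", "album_y"},
}

def jaccard(a, b):
    """Similarity between two users' item sets (shared / total)."""
    return len(a & b) / len(a | b)

def recommend(user, data, k=1):
    """Rank items by the summed similarity of the other users who own them."""
    mine = data[user]
    scores = {}
    for other, items in data.items():
        if other == user:
            continue
        sim = jaccard(mine, items)
        # Only items the target user doesn't already have are candidates.
        for item in items - mine:
            scores[item] = scores.get(item, 0.0) + sim
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(recommend("alice", purchases))
```

Here both bob and carol overlap heavily with alice, and both own album_y, so it tops her recommendation list - the "maybe you want it too" inference in miniature.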
Despite collaborative filtering being pretty good, there's lots and lots of room for improvement - and there has been since the early '90s. The basic issue is that, while the technique is fundamentally sound, people have been using essentially the same technique for 15+ years. Every year there are a few
startups that have a better recommendation engine, and the major in-house ones
get better and better, but these improvements are only little increments.
This is mainly because they come from using different data sets, more and
bigger data sets, and tweaking well-known algorithms, rather than doing
anything fundamentally new.
The Times article didn't discuss a few things that are happening now that will make recommendations a whole lot better soon. While collaborative filtering won't go away, it will be used in
conjunction with other techniques and the quality of customer recommendations
will get way better.
Posted at 03:39 PM in Data Mining, KDD, Social Media, Web/Tech | Permalink | Comments (0) | TrackBack (0)