Blaming the new leaders or aggregators for disrupting the business of the old leaders, or saber-rattling and threatening to sue are not business strategies – they are personal therapy sessions. Go ask a music executive how well it works.
I’m going to grossly over-generalize and guess that the rumored iTablet is going to be the smallest but most powerful computer we’ve seen; in other words, I completely disagree with those who suggest this is going to be a larger iPod Touch. Except for Apple TV, the company has not disappointed with its new products…so why would they choose to disappoint now?
Anywho, I just think it’s interesting that with all this cash on hand, the only very public thing that Apple has done is buy PA Semi and hire a bunch of “chip guys.” Next, I find it interesting that Steve Jobs once mentioned something along the lines of focusing less on the OS and more on how chips are used; I can’t find a link and I’m not familiar with the terminology, but it seemed analogous to getting our brain to use 100% of its “computing power” rather than its common 10%. Which would make sense, now that Apple is pushing developers to support 64-bit.
So, my guess is that this new tablet will actually run something like Snow Leopard. It will be one of the smallest and most powerful computers we’ve seen. Oh, and they are going to do something amazing with the keyboard. I watched a TED video yesterday where a guy projected a keypad onto his hand and dialed a phone number; I seriously think it will be something like that.
I mean, our Dear Leader, who has a lot to say about iTablet, once wrote a little book that ended with Steve Jobs taking time off to dream up a “computer built entirely out of a single sheet of touch-screen plastic that you can roll up and carry with you in a tube.” It may not be that amazing, but it’s gonna be close.
More news that makes me think I’m right. The WSJ got a leak about the Apple tablet shipping in March. They say that the development of the tablet has been going in “fits and starts” for years. Think about what has changed in Apple’s history over the past couple of years that would bring them to the point where they could make a tablet:
- Apple buys FingerWorks, paving the way for better multi-touch devices.
- Apple buys PA Semi, which The Street recently noted would provide a low power chip for the tablet. Another blogger added to The Street report by writing that PA Semi would provide ARM processors; however, it is rumored that PA Semi does not, in fact, make ARM processors. Moreover, before the company was bought, it debuted a 64-bit dual core processor that was amazingly energy efficient.
- Apple developed the MacBook Air using single sheets of aluminum, allowing for a thinner and lighter design.
- The MacBook Air also cut the wires and optical drives from the computing system.
- Moving to a solid state drive allows for less energy use.
- Apple has changed how they do batteries: non-removable (which allows for more room inside the computer), more power storage overall, and up to 3 times more recharges than before.
- Next, a recently discovered patent describes technology that, instead of sandwiching a multi-touch screen over an LCD screen, combines the two into one piece. This would allow multi-touch devices to be even thinner than they are now.
- Finally, a former employee anonymously told the NY Times that Apple has “‘spent the past couple of years working on a multitouch version of iWork.’” This flies in the face of the HP device that simply repackages Windows 7.
I think all of this points to a growing body of knowledge at Apple that could produce something incredible. I disagree with all these reports that talk of, basically, a souped-up version of the iPod Touch…an iPod Touch with a larger screen and a “special” version of the iPhone’s OS. Think about this though: would Apple really go through the trouble of creating all this hoopla and hearsay for a mere upgrade? I think the answer is yes only if they need to pull the wool over our fanboy (and girl) eyes ‘cause they screwed the pooch on this one for years.
Work with me here. Look at what the whole industry is introducing right now at CES: a bunch of different versions of the Kindle. That’s exactly the kind of affront to creativity that would make Dear Leader double over in agony in the Jobs Pod. Offering a cheap piece of plastic with limited functionality, aka, a “netbook” that lets you read Dan Brown’s latest novel without straining your eyes (see here and find the mention of “The Lost Symbol” to get that joke) is not Apple’s style. And they’re not going to raise the bar by adding “multimedia” options to it, such as music, YouTube videos, and, gasp, magazines. Nope.
My wager? The MacBook Pro in your pocket.
I got an opportunity to swing for the fences in search.
Piracy is essentially the consumer’s wish to have everything on demand. It’s not like people want to necessarily have it for free.
Daniel Ek, co-founder of Spotify
Last night I dreamt I was playing a hand-to-hand fighting game, but the characters were stick figures. Even better, it utilized a multi-touch interface. Your fingers would get in the way of the animation a bit, but for some reason that was okay.
Too bad I don’t know how to make an iPhone app. Let alone create a video game. All I know is it’s much better than the games I found by Googling “stick figure fighting game.”
PS-It’s the second best dream because a couple of years ago I dreamt I could fly through space without needing a spaceship. I think I was meant to be an astronaut. Or a sci-fi film director…
Like others, for a few weeks I’ve been trying to use only Bing. Honestly, for my job Google takes up most of my day, so I really was just hoping to add some variety to the mix.
I can type in a rush, and today my fingers failed me; I misspelled the search [analyt ics blog] in Bing, and it returned this:
Um… “Counter Strike”? Wasn’t that the multi-player shoot-everyone-you-see game all the guys in my dorm would play until the wee hours of the morning before their tests?
I figured this would be a good time to double-check Google’s results:
Okay, now that’s more like it. The first result is exactly what I was thinking, but couldn’t correctly type out. I think this ends the experiment for me. I know my search was misspelled, but returning “Counter Strike” was not even close in my book.
Bottom line: you can’t be a “decision engine” if you can’t read my mind (or close to it).
One of the partners just pointed out that Citigroup is probably not a good source to cite since it is widely thought to be one of the worst-run banks in the U.S. right now…
[Part II of my “overdue rambles” series]
The internet as we know it was made public in the early ’90s. Basically, it hasn’t even finished college. That’s exciting.
Why does this excite me? Because it means the internet has not come close to its full potential. I am exhilarated to see what is going to happen in the next ten years. For instance, Tom Brokaw once talked about the internet like it was Twitter. This “thing” called the “internet” was new, hip, and no one quite understood what to do with it. And that was in the ’90s. Imagine what it will be like this coming decade.
First, we need to reach (near) 100% broadband penetration. The FCC agrees that even rural broadband should be a priority. Moreover, we need the bandwidth to support that kind of access. Walt Mossberg broached this issue with John Malone:
Walt: “Why doesn’t anyone offer me, through satellite or cable, a ‘real’ Internet experience when it comes to TV, where I can get what I want, on demand? The DVRs they offer don’t cut it. Why can’t I buy a service where I turn it on and it’s like the Internet, where I get whatever I want? So it’s all Hulu, all the time, but with everybody fed into it, and with all the content?”
Malone: “It’s this nasty little thing called bandwidth.”
One change I believe will take place is that consumers will want (almost) everything, now. And the ability to deliver on that premise could be a benchmark of web maturity.
For instance, we thought the brick-and-mortar Blockbuster was a big deal, up until Netflix delivered movies to your home. But when the studios get their act together (and the nation’s internet issues are fixed), videos will then be delivered to your inbox. Netflix knows this; instead of their emails asking when you mailed or received a certain title, you will get an email that such-and-such film is now available, and whether you’d like to watch.
Yes, yes I would.
I want to rant here instead of Fred Wilson’s comment thread, since this is decidedly a tangent. You might label this nit-picking, but something really got under my skin concerning the current cover of Time Magazine. Specifically, the issues of “participation” and “innovation.”
“Genuine, Public Conversation”
First, Steven Johnson writes:
For as long as we’ve had the Internet in our homes, critics have bemoaned the demise of shared national experiences, like moon landings and “Who Shot J.R.” cliff hangers — the folkloric American living room, all of us signing off in unison with Walter Cronkite, shattered into a million isolation booths. But watch a live mass-media event with Twitter open on your laptop and you’ll see that the futurists had it wrong. We still have national events, but now when we have them, we’re actually having a genuine, public conversation with a group that extends far beyond our nuclear family and our next-door neighbors.
So, critics of the online world “bemoan” the loss of nationally shared (unifying?) experiences; Johnson replies that the old world of media was one of experiences shared in what amounted to separate “isolation booths,” while new media enables genuine interaction. Now put this alongside something I read in Gayatri Spivak’s book Outside in the Teaching Machine a few nights ago. I am taking the quote out of context; nevertheless, it made me think about the United States’ current online situation (i.e., social media):
Derrida has pointed at the ceaseless effort to construct the simulacrum of a committed and participatory public through talk show and poll […].
It is curious that, online, within the so-called Twittersphere, link economy, social media “conversation,” et al., we continue to construct a narrative of growing interaction, as if in the past Americans were a bunch of isolated, non-participatory loners (who happened to share national experiences, albeit in isolation).
Yet, in another field of discourse, we have some criticizing the self-congratulatory construction of an ever-growing new wave of participation. That field of criticism is warranted because this participation is virtual (and I do mean virtual, in every sense of the word). Johnson hints at this in the same article:
[T]he Twitter platform is likely to expand that strangely delusional relationship that we have to fame. […] from the fan’s perspective, it feels refreshingly intimate: ‘As I was explaining to Oprah last night, when she asked about dog ticks …’
So the new wave of participation the online world heralds is actually held together by a tenuous semblance of intimacy. I cannot overemphasize that this is no small delusion: the White House now has a Twitter account, Facebook page, YouTube and Vimeo channels, and a blog, among other initiatives. We must be careful with the “perceived” intimacy created within social media, because in the end, the joke may be on us.
“A Larger Truth About Modern Innovation”
Fred Wilson, in his post, quotes this entire section by Johnson:
The speed with which users have extended Twitter’s platform points to a larger truth about modern innovation. When we talk about innovation and global competitiveness, we tend to fall back on the easy metric of patents and Ph.D.s. It turns out the U.S. share of both has been in steady decline since peaking in the early ’70s. (In 1970, more than 50% of the world’s graduate degrees in science and engineering were issued by U.S. universities.) Since the mid-’80s, a long progression of doomsayers have warned that our declining market share in the patents-and-Ph.D.s business augurs dark times for American innovation. The specific threats have changed. It was the Japanese who would destroy us in the ’80s; now it’s China and India.
But what actually happened to American innovation during that period? We came up with America Online, Netscape, Amazon, Google, Blogger, Wikipedia, Craigslist, TiVo, Netflix, eBay, the iPod and iPhone, Xbox, Facebook and Twitter itself. Sure, we didn’t build the Prius or the Wii, but if you measure global innovation in terms of actual lifestyle-changing hit products and not just grad students, the U.S. has been lapping the field for the past 20 years.
So, the doomsayers in this context have decried declining patents; Johnson replies that American innovation is not only alive and well, but if you look in the right place it’s evident America has been “lapping the field” for two decades. My issue with this: since America, evidently, isn’t pole-vaulting over the world in Ph.D.s and patents, Johnson simply changes the criteria of success to that of a marathon. And in that competition, we’re still ahead. Of the world. Because America’s innovation is “modern.” Thus, “lifestyle-changing.” And consequently, “hit products.” But placing innovation solely in terms of technological advancement is a problem because it negates other life-changing innovations outside of technology. Moreover, it enables Johnson to bypass the Wii and Prius, so that even if a country besides America produces technological innovation, he can still label them as behind…literally, they are all behind since America is lapping the whole field.
And so it is that a story that purports to tell us about Twitter’s open platform and innovation and how that is changing the way we live somehow ends up in a tangent about how America is better than the rest of the world.
Moreover, Johnson accomplishes this by simply changing the rules of the game (e.g., whereas the measure was Ph.D.s, it’s now lifestyle-changing products).
Am I being politically correct? YES. Nit-picky? YES. However, that it’s so *easy* to make these little slips of we’re-better-than-them bothers me. Especially when it’s labeled as the “truth.”
That pointing out us-vs-them language may be considered nit-picky… that is a larger issue.
[part I in some overdue rambles]
I think a lot of people have arrived at this realization before me. But, I am someone who learns from experience, so witnessing first-hand how the digital world has changed my daily habits has led to a more personal understanding of how the internet is changing things.
The old TV I inherited from a previous roommate doesn’t have a working DVD player. It sits on our floor while my wife and I decide what to do with it; that’s because we catch up with 30 Rock on Hulu when we finally have a moment to sit down and watch together. Friday night is movie night, either a DVD from Netflix or a streaming option from the same company (thank you, Starz, for adding so many movies!). In the mornings, we go to NY1 and watch a few stories before heading out the door. Speaking of…
Before hitting up NY1, my wife and I are inclined to visit the NY Times first (specifically, the article skimmer). I prefer their World, U.S., and NY news sections, as well as the Arts. I’ve noticed recently that even though I am in interactive advertising, I do not really click on the Times’ Business or Technology sections. Most likely that is because I use Google Reader to keep tabs on about 63 feeds relating to business and technology. You might have noticed Google News is not a part of my daily habit… long story. Short version? I hate using it.
I read on the subway all the way to work and all the way home. I am an underliner and margin-writer. I think sooner or later all books are going to have digital copies floating around. The WSJ reported on a push to digitize old manuscripts, etc., especially as they are stolen, lost, or fall apart. Let’s hope that note-taking feature on the Kindle gets as sophisticated as I’d like…
All This Got Me To Thinking
We will reach a point where our entire lives, or, at least the majority of the content/knowledge we have encountered and appropriated (through purchase, through browsing) will be immediately searchable.
- After getting married, we were given some great new cookbooks. On busier days, we might just Google the ingredients we have on hand to find something to make; instead, I want to search which page of the cookbooks we own has a recipe with the ingredients we have on hand. Unless we have company, or know we have a busy week ahead of us, we do not plan out our meals. I’d like to search what I already have; basically, a marriage of our kitchen and our bookshelf.
- I want to search my books to find specific quotes, or even notes I’ve left in the margins.
- Songs purchased through iTunes or Amazon should come with all the information and lyrics attached, so I can search my library for specific lyrics or publishing dates.
- Transcripts of the videos I watch will be searchable. A search result for the videos will show me the most likely time points I should look, and will take me there when I click on the search result. This might even be taken a step further, where info from IMDB (such as trivia and goofs) will have links to those exact spots in the films.
- iPhoto has introduced tags to photographs… like Flickr, the info I put on my personal photos will become increasingly searchable.
Anywho, all this goes to say that we encounter an almost overwhelming amount of content every day. Services are popping up to keep track of this (from Delicious to, well, Google). In the future I think we will not just have a Google for the internet, but basically a Google for our lives. Not necessarily from Google itself (e.g., I think Apple Spotlight does a good job for me), but something along those lines where we can immediately access something that we have come across before and can’t remember where/how. We will use services that index not just the world’s public knowledge/content, but our own as well. Personalized indexing.
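To make the idea concrete, here is a minimal sketch of what such personalized indexing might look like: a toy inverted index over one’s own content (cookbook pages, margin notes, lyrics), mapping words to the personal sources that contain them. Everything here is hypothetical, assumed for illustration only; a real “Google for our lives” would obviously be far more sophisticated.

```python
from collections import defaultdict

class PersonalIndex:
    """A toy inverted index over personal content (notes, margin
    scribbles, lyrics, transcripts): maps each word to the set of
    sources that contain it."""

    def __init__(self):
        self.index = defaultdict(set)  # word -> {source, ...}
        self.docs = {}                 # source -> full text

    def add(self, source, text):
        # Store the text and index every lowercase word in it,
        # stripping trailing punctuation.
        self.docs[source] = text
        for word in text.lower().split():
            self.index[word.strip(".,!?;:")].add(source)

    def search(self, query):
        # Return only the sources containing every word in the query.
        words = [w.lower() for w in query.split()]
        if not words:
            return set()
        results = self.index[words[0]].copy()
        for w in words[1:]:
            results &= self.index[w]
        return results

# Hypothetical personal content: a cookbook page and a margin note.
idx = PersonalIndex()
idx.add("cookbook p.42", "Roasted chicken with lemon and thyme")
idx.add("margin note, Spivak p.10", "simulacrum of participation via polls")
print(idx.search("lemon chicken"))  # -> {'cookbook p.42'}
```

The kitchen-and-bookshelf wish above is exactly this query: search the ingredients you have on hand and get back the pages of the cookbooks you already own.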
On top of that, we’ll see more content (such as movies), and greater leveraging of this new content with older stuff (matching up IMDB info to films online). Wolfram Alpha is a step in this direction. Perhaps Google Squared as well.
Far-fetched? Old news? Comments encouraged :D