All Microsoft services are for entertainment purposes only, in the sense that you'd have to be absolutely crazy to use one before a Microsoft sales rep has taken your execs to a steakhouse and a strip club.
The scrapers should use some discretion. There are some rather obvious optimizations. Content that is not changing is less likely to change in the future.
If you’d like the opposite of this story: I was reversing out of a parking spot. I had moved about 2 feet when a drunk driver hit my vehicle and fled. The driver was charged by police, because driving at 40 mph and hitting cars in a shopping center in broad daylight got a lot of attention.
I gave the video footage from my car to the local police and my insurance company, and the insurers still ruled I was at fault because I was in reverse.
Hey thanks for sharing. I'd like to apologize for everyone below not getting the point - not understanding that these things cut both ways, and can often have unintended effects even for those enabling extra cameras for no measurable outcome for themselves. It's an incredible highlight you shared!
Right, but unless the parking spots are extra spacious, or your car is extra small, you won't be able to park your cart next to your trunk, and you have to ferry the contents of the cart from the front of your car to the trunk.
Local AI sounds nice, but most of Apple’s Macs and other devices don’t ship with enough RAM, at a reasonable price, for good model performance, and macOS itself is incredibly bloated.
That's true for current LLMs, but Apple is playing the long game.
First, they are masters of quantization optimization (their 3-4 bit models perform surprisingly well).
Second, Unified Memory is a cheat code. Even 8GB on M1/M2 allows for things impossible on a discrete GPU with 8GB VRAM due to data transfer overhead. And for serious tasks, there's the Mac Studio with 192GB RAM, which is actually the cheapest way to run Llama-400B locally.
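The back-of-the-envelope math here is simple (a rough sketch counting weights only, ignoring KV cache and activation overhead, with `model_memory_gb` a hypothetical helper):

```python
def model_memory_gb(params_billions: float, bits_per_weight: float) -> float:
    """Approximate weights-only memory footprint in decimal GB."""
    bytes_total = params_billions * 1e9 * bits_per_weight / 8
    return bytes_total / 1e9

# A 400B-parameter model at different precisions:
print(model_memory_gb(400, 16))  # fp16: 800.0 GB -- hopeless on consumer hardware
print(model_memory_gb(400, 4))   # 4-bit: 200.0 GB -- just over 192 GB
print(model_memory_gb(400, 3))   # 3-bit: 150.0 GB -- fits in 192 GB unified memory
```

Which is exactly why the 3-4 bit quantization work matters: at 4 bits a 400B model slightly overflows 192 GB, but around 3 bits it fits, with room left for the KV cache.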
Depends what you are actually doing. It's not enough to run a chatbot that can answer complex questions. But it's more than enough to index your data for easy searching, to prioritise notifications and hide spam ones, to create home automations from natural language, etc.
Apple has the ability and hardware to deeply integrate this stuff behind the scenes without buying in to the hype of a shiny glowing button that promises to do literally everything.
That might work well for Apple to be the consumer electronic manufacturer that people use to connect to OpenAI/Anthropic/Google for their powerful creative work.
I find it more likely that the entire "second" level of software companies are in OpenAI's cross hairs more so than Google. Salesforce, ServiceNow, Intuit, DocuSign, Adobe, Workday, Atlassian, and countless others are easier to pick off than Google.
Those don't seem like reasonable targets at all to me. OpenAI's product is information and their power is engagement. It's more like a cross between Facebook that thrives on engagement and Google that delivers information.
Google's biggest advancement in the last ~15 years is to produce worse search results so that you spend more time engaging with Google, and doing more searches, so that Google can show more ads. Facebook is similar in that they feed you tons of rage-bait, engagement spam, and things you don't like, infused with nuggets of what you actually want to see about your friends and interests. Just like a slot machine, the point is that you don't always get what you want, so there's a compulsion to keep using it because MAYBE you will get lucky.
OpenAI's potential for mooning hinges on creating a fusion of information and engagement where they can sell some sort of advertisement or influence. The problem of course is that the information and engagement is pretty much coming in the most expensive form possible.
The idea that the LLM is going to erode actual products people find useful enough to pay for is unlikely to come true. In particular, people are specifically paying for software because of its deterministic behavior. The LLM is by its nature extremely nondeterministic. That puts it squarely in the realm of social media, search engines, etc. If you want a repeatable and predictable result, the LLM isn't really the go-to product.
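A toy illustration of that split (a hypothetical sketch; real LLM serving adds further nondeterminism from batching and floating-point ordering, and the function names here are made up):

```python
import math
import random

def deterministic_lookup(invoice_id: str) -> float:
    # Traditional software: same input, same output, every single time.
    prices = {"INV-001": 99.95, "INV-002": 149.00}
    return prices[invoice_id]

def sampled_next_token(logits: dict, temperature: float = 1.0) -> str:
    # LLM-style decoding: with temperature > 0, the same input can
    # produce a different output on every call.
    weights = [math.exp(v / temperature) for v in logits.values()]
    return random.choices(list(logits.keys()), weights=weights)[0]

print(deterministic_lookup("INV-001"))            # always 99.95
print(sampled_next_token({"yes": 1.0, "no": 0.8}))  # varies run to run
```

You can pin temperature to 0 for greedy decoding, but the moment any sampling is involved, repeatability is gone, which is precisely what paying software customers don't want for billing, compliance, and the like.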
Not every kid born in the last five years will know Google as a verb as we do. They’ll be adults in 15 years, which is a paltry investment timeline for the type of Black Swan event we’re talking about, which AI is.
I don’t disagree with you entirely, but I’d argue the second level apps are harder to chase because they get so specialized.
The death of Google (as everyone knows Google today) is a tricky one. It seems impossible to believe at this exact moment. It could sit next to IBM in the long run — no shame at all, amazing run.