Cost-wise, there's a solid chance that the Pi would have been more expensive. Jeff Geerling ran some numbers (^1) on this last year, before the current chip crisis we're in, and it was bad enough already.
Home Assistant does a surprising amount of disk I/O, if for nothing else than logs. Sibling commenters are also advising against running it off the SD card to avoid wearing it out, so there's definitely some truth here. This means we're adding a Pi M.2 HAT + SSD into the mix. The Pi 5 SSD kit with 256 GB, when it was available, was around $60 USD. A Pi 5 with 8 GB of RAM is $130 USD. Now we need a cooler, a case that will fit the Pi 5 with said M.2 HAT, and a power supply. We're already well north of $250 USD, encroaching on $300, and we're not even using the core benefits of the Pi platform. No need for GPIO pins, tightly integrated cameras or other sensors, none of that.
For all we know, the blog author did this assessment (or trusted the assessment of others, e.g. Jeff's) and came to the same conclusion - it wasn't worth the price of entry.
I used to run HA on an RPi, but eventually migrated it to a similar NUC. The RPi eventually just wasn't powerful enough (peak compute needs), while the NUCs are still quite cheap. And you can run a surprising amount of Proxmox VMs and LXCs on barely a few cores and gigabytes of RAM.
The cool thing is that it's very easy to migrate to better hardware: HA's backup-and-restore system is highly reliable. For this reason I can definitely recommend an RPi to start with - and who knows, perhaps it will be enough forever - but if not, then moving is a matter of ~one evening.
Funny that I should run into this now... Just this past weekend I tried the Home Assistant backup/restore mechanism for the first time, and it failed miserably for me :-(.
First it took over an hour to create the backup, then I got a 4.42 GiB tar file, which of course failed to upload to the new Home Assistant install.
I investigated and found that the tarball was just a compressed copy of the complete installation directory of my Home Assistant setup, including multi-gigabyte `.cache/pip` and `.cache/uv` directories :-s. (My old Home Assistant install runs from a Python virtual environment that I created; Home Assistant keeps nagging me that this installation method is deprecated, so I decided to migrate to HAOS in a VM.)
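For anyone else in the same boat with a venv-based install, it's worth checking for and pruning those caches before creating a backup. A minimal sketch, assuming the install lives under `~/homeassistant` (adjust the path and names to your setup):

```shell
# Assumed location of the venv-based install - adjust to your setup
HA_DIR="${HA_DIR:-$HOME/homeassistant}"

# See how much space the caches are actually taking
du -sh "$HA_DIR"/.cache/* 2>/dev/null || true

# pip and uv caches are safe to delete; they are rebuilt on demand
rm -rf "$HA_DIR/.cache/pip" "$HA_DIR/.cache/uv"

# Or keep them and simply exclude caches from a manual tar backup
[ -d "$HA_DIR" ] && tar --exclude='.cache' -czf ha-backup.tar.gz -C "$HA_DIR" .
```

(GNU tar's `--exclude` matches the `.cache` directory anywhere in the tree and skips its contents, so the resulting archive stays small.)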
When I deleted those directories the tarball was less than 200 MiB, but the new HAOS VM still would not accept the upload. All I got was "500 Internal Server Error - Server got itself in trouble". And of course, because HAOS is an "appliance", it's kind of a black box, so I couldn't find out how to get access to error logs with details :-(.
In the end I decided that the path of least resistance was to simply start from scratch based on the HAOS virtual machine and take some days/weeks to build up the new Home Assistant setup before it's mature enough to take over from the old Home Assistant setup (which is running on hardware that is close to failure).
Wow, not sure how to interpret your experience that an RPi wasn't powerful enough to manage watering a few plants. I can only suspect the overall software setup is massively bloated.
If you want to run ESPHome inside HA, and you recompile the device firmware (on every ESPHome release), you want a decent processor/disk. The ESP stuff is a surprisingly heavy compile for a puny microcontroller.
A recent RPi is sufficient to handle a few plants - though, yes, recompilation will take time. ESPHome is a beautiful piece of software, hence I highly recommend it. My native language has an expression that describes this situation perfectly: the appetite grows with eating. Next thing you know you have 2k or more entities, and your HA even handles some video feeds.
The important thing is that it's pretty much always easy to make an upgrade thanks to the good design of the backup system. Don't forget to set up backups in either case, it's a sin to not use such a complete system :)
Recompilation has nothing - or should have nothing - to do with the requirements to run the system. If it really is a requirement, then the system is bonkers.
For a system handling sensors and actuators, you should be able to run a farm off an RPi in terms of compute power. Quad-core at 2.4 GHz and up to 16 GiB of RAM on an RPi 5 - that's a crazy amount of compute for the use-case.
The way ESPHome works is that your device configuration is a YAML file that produces a compiled binary artifact, which can be updated OTA over wifi. The downside is that these updates are pushed from the device you run HAOS on, so the compilation happens there too and can take a while.
HAOS is quite bloated but it's also very versatile and FOSS
There's no reason you have to run ESPHome on your Home Assistant server.
It's offered as a HA a̵d̵d̵o̵n̵ app for ease of use (makes it a one-click install), but you can also just `pip install esphome` or use the provided Docker image and get the exact same UI, but with everything (including compilation) running on your much beefier laptop.
So your binaries get compiled quickly and you can still do the OTAs directly from your laptop. HA needn't be involved.
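The standalone flow is just a few commands. A sketch, assuming a hypothetical device config `living_room.yaml` and a device reachable at `living-room.local` (adjust both names to your setup):

```shell
# Install ESPHome on your laptop instead of using the HA add-on
pip install esphome

# Compile the firmware locally - this is the heavy step
esphome compile living_room.yaml

# Push the freshly built binary to the device over wifi (OTA)
esphome upload living_room.yaml --device living-room.local

# Or compile, upload, and tail the device logs in one go
esphome run living_room.yaml
```

If you prefer the web UI the add-on provides, `esphome dashboard <config-dir>` serves the same dashboard locally.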
Every time you make a change to your yaml it requires a recompile. I think ESPHome now allows you to change configurations and upload a bin compiled on your main machine, so it's really not a limitation at all. Plus, once it recompiles it automagically uploads, so just make your changes and forget about it.
Home Assistant is indeed a massive pile of software, mostly Python. I couldn’t get it to work reliably (or, at least, usably - the web interface was painful to use) on a Pi Zero because of memory requirements and disk access speed.
…having said that, as the other poster alludes to, it's peak requirements that are problematic. If your device can handle them, it's not a massive power suck, because idle requirements are low.