Yet another problem with MCP: every LLM harness that does support it at all supports it poorly and with bugs.
The MCP spec allows MCP servers to send images back to clients (base64-encoded, in a defined JSON schema). However:
1) Codex truncates MCP responses, so it will never receive images at all. This bug has existed forever.
2) Claude Code CLI will not pass the resulting images through its multi-modal visual understanding. Worse, if asked to describe such an image, it will hallucinate a description outright.
3) No LLM harness can deal with you bouncing your local MCP server. All require you to restart the harness. None allow reconnection to the MCP server.
I assure you there are many other similar bugs, whose presence makes me think that the LLM companies really don't like MCP and are buggily deprecating it.
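For reference, the image payload the spec defines is just a content block with base64 data and a MIME type. A minimal sketch in Python of what a server would return (the dict shape follows the MCP "image" content type; the helper name and the fake PNG bytes are my own for illustration):

```python
import base64

def image_content(png_bytes: bytes) -> dict:
    """Build an MCP 'image' content block from raw PNG bytes."""
    return {
        "type": "image",
        "data": base64.b64encode(png_bytes).decode("ascii"),
        "mimeType": "image/png",
    }

# Fake 8-byte PNG signature, just to exercise the encoding round-trip.
block = image_content(b"\x89PNG\r\n\x1a\n")
print(block["type"], block["mimeType"])
```

A harness that truncates the response or drops this block on the floor never even gets to the "feed it to the vision model" step, which is the failure mode described above.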
I've got one of these! Mine is called 'roboflex' (github.com/flexrobotics). It's C++/Python, not Rust, but similarly born out of frustration with ROS. Writing your own robotics middleware seems to be a rite of passage, just like 'writing your own game engine'. Nothing wrong with that - ROS is powerful but has legitimate problems, and we need alternatives.
Although tbh, these days I'm questioning the utility. If I'm the one writing the robot code, then I care a lot about the ergonomics of the libraries or frameworks. But if LLMs are writing it, do I really care? That's a genuine, not rhetorical question. I suppose ergonomics still matter (and maybe matter even more) if I'm the one that has to check all the LLM code....
Take a look at github.com/dimensionalos/dimos. We are a team building not just a replacement for ROS, but one that can be easily vibe-coded, and one that stays compatible with ROS and containers.
Always looking for testers and feedback if you want to influence the design/API.
A few years ago there were actually two companies trying to manufacture ZBLAN optical fiber (which has better light transmission than normal optical fiber) in orbit: Made In Space and FOMS. Both of their websites are tombstones now, afaik. The former was also attempting 3D printing in space, and was bought by Redwire.
Fascinating tech, but seemed to go nowhere.
There are now several 'manufacturing in space platform' companies, like Varda. It's not enough to just be a platform. There needs to be an actual killer app.
Borges: Selected Non-fictions. Think his fictions are good? His non-fictions, imho, are even better. You can read three sentences and feel like you just listened to a symphony - you get that constant Borges wit, erudition, mystery. The English translations are SO good. Are they even better in Spanish?
I did some robotics tactile research, and it was super fun! We used BioTac sensors, which are very capable, but 1) they are crazy expensive and 2) their skins, which do wear out, are crazy hard to replace.
One advantage BioTacs have over these is that I can send a guy a (very large) check and buy them. Most academically-sourced devices like this cannot be gotten for any price. These look cool; I'd love to have a few.
Seems like you could make the skin pretty straightforwardly in a home shop. You'd just need to 3D-print TPU and embed some high-quality magnets (you could probably remagnetize your own pretty easily, though not cheaply: https://www.magnet-physik.de/en/magnetizing-technology/magne...)
ROS is irredeemable. I've seen it used in large projects, and the amount of time wasted "wrestling with ROS" was ridiculous. I believe this will never change. Just because TensorFlow existed, should folks not have looked for an easier way and created PyTorch? No! Forge on, intrepid ROS-replacers!
I think ROS' biggest benefit is that the "golden path" is well documented and it has a huge community.
I found that if you "stick to the golden path" (monorepo with all your nodes/pkgs + colcon as the build system, deploy to a single supported OS), ROS mostly Just Works. That's a lot of preconditions, though - deviate even a little and you're in for a world of pain.
If you try to develop a ROS system as any other C/C++/Python project, there will be some confusing things like always having to source the environment setup file.
Installing it is ... difficult (to say the least) if you're not using the exact version of Ubuntu that is officially supported. One pretty good workaround I found for this is using a VSCode Dev Container for ROS development. I'm then cross compiling to arm64 using Yocto Linux and meta-ros, so I get reliable and reproducible deployments. Once you get over the initial setup pain (which is significant), it's not too bad.
I think things could be significantly improved, but I will also say: I see many students at my uni who build advanced robots with ROS, and they probably wouldn't consider themselves expert programmers. A similar effect happens with Arduino. Replicating this "novice-compatibility" is, in my opinion, the hardest part of replacing ROS.
> always having to source the environment setup file
We are wrapping everything in bash entry points to accomplish this.
> ... build advanced robots with ROS ...
Yes, experiencing this first hand... there are just so many examples out there for integrating with all the various sensors and such.
> I'm then cross compiling to arm64 using Yocto Linux and meta-ros, so I get reliable and reproducible deployments. Once you get over the initial setup pain (which is significant), it's not too bad.
Can we get on a call and talk about this? I'm comfortable with the concepts you're describing, but we've not sorted out how to make an actually deployable artifact from (for one example) our workspace that uses pymoveit.
EDIT: Saw your email in your profile and sent a message
Sure! I love to talk about this stuff ;) My email is in my profile. Let me know what you're trying to accomplish and we can discuss it.
For context: I'm building a "satellite bus/payload computer" OS based on Yocto Linux for space applications addressing the typical problems people encounter when trying to use Linux for space:
- I've set up my robot/OBC/gadget by copying files into /home/user and it's running. How do I deploy this and keep track of how the image was built? Some people make an image of the SD card. Others write a script that customizes the rootfs and creates an image (better!). But we want control of the whole stack, since we want to implement secure boot and a few other things.
- I want to update my robot/satellite. Ideally using as little bandwidth as possible. How? Some options: have an A/B system and download a new (delta-)image to the other partition. Or I just upload the changed files. To solve this we use OSTree which lets us have a versioned filesystem with extremely small bsdiff delta updates. It's very satisfying.
> we've not sorted out how to make an actually deployable artifact from (for one example) our workspace making use of pymoveit
Heh, this is the big question for ROS and meta-ros. How do you deploy a workspace? So far we've settled on building the workspace in Yocto and installing it straight into the ROS prefix (typically /opt/ros/$ROS_DISTRO).
It's a bit complicated because we're also using ROS 2 Rust, and the recipe should be converted into a .bbclass, but we haven't gotten that far yet.
Well, our brains are closer to spiking neural networks than 'regular' neural networks. And they work pretty well. For the most part.
I feel like SNNs are like Brazil - they are the future, and shall remain so. I think more basic research is needed for them to mature. AFAIK the current SOTA is to train them with 'surrogate gradients', which shoehorn them into the current NN training paradigm, and that discards some of their worth. Have biologically-inspired learning rules, like STDP, _really_ been exhausted?
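For anyone unfamiliar, STDP is appealingly simple: the weight change depends only on the relative timing of pre- and post-synaptic spikes. A minimal sketch of the standard pair-based rule with exponential windows (the amplitudes and time constants below are illustrative, not from any particular paper):

```python
import math

# Illustrative STDP parameters: potentiation/depression amplitudes
# and exponential-window time constants (milliseconds).
A_PLUS, A_MINUS = 0.01, 0.012
TAU_PLUS, TAU_MINUS = 20.0, 20.0

def stdp_dw(t_pre: float, t_post: float) -> float:
    """Weight change for one pre/post spike pair (spike times in ms)."""
    dt = t_post - t_pre
    if dt > 0:    # pre fires before post: causal, potentiate
        return A_PLUS * math.exp(-dt / TAU_PLUS)
    elif dt < 0:  # post fires before pre: anti-causal, depress
        return -A_MINUS * math.exp(dt / TAU_MINUS)
    return 0.0

print(stdp_dw(0.0, 10.0) > 0)  # True: causal pair strengthens the synapse
print(stdp_dw(10.0, 0.0) < 0)  # True: anti-causal pair weakens it
```

The whole rule is local in time and space - no backprop through the network - which is exactly what surrogate-gradient training gives up.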
If OpenAI or DeepMind made such a claim, I'd pay attention. Otherwise it's always some (usually hardware) guys trying to get a grant, or even just publish a paper.
p.s. People interested in biologically inspired data processing algorithms should look at Numenta's papers (earlier ones, because recently they switched to regular deep learning), and especially learn their justification for not using spikes.
Basically a simpler version of ros. Easily, and performantly (is that a word?) connect cameras and other sensors, via xtensor/eigen/numpy to whatever algorithms you have, and control actuators/robots.
The Republicans have been in support of nuclear power since before you ever even heard of "climate change." Back then we just called it "clean air." Are you interested in saving the planet, or are you interested in shaming Republicans?
Here's my real, nuanced position. Republicans' ability to make this happen is somewhat limited. They can fund research, they can reform environmental regulations that stifle nuclear power, they can subsidize the industry with loan guarantees, and they can work with other countries to promote nuclear power globally. They're doing all that. The remaining hurdles are largely social ones, and the liberals largely own the culture. We need Hollywood to make pro-nuclear media. We need movie stars to support it in their dumb speeches at awards ceremonies. This will help stave off some of the NIMBYism. We need the left to tell the Sierra Club and their ilk to stuff it and not sue every attempt to build a nuclear power plant into oblivion... those groups aren't going to listen to Republicans.
It's totally true that the environment is not as high a priority for Republicans as it is for Democrats. The GOP supports nuclear, but they're not going to make it their top priority. As long as the Democrats continue to block it, as long as liberal groups make it practically impossible to even break ground on a nuclear plant, the GOP will just keep on happily burning coal. It's fine to think whatever you want about Republicans. I probably won't even disagree with you on most of your opinions. But this is a super easy major win for the environment that is being left on the table because the Democrats aren't willing to play ball.
I'm curious about this as well. It's now #69, 5 minutes later. I seem to recall this happening quite a bit for global warming stories. Anyone from HN want to explain please? Do all 'political' stories get buried?
Don't be ridiculous. There are explanations of the ranking algorithm, and even a cursory glance explains this. Top-ranked posts get a couple hundred replies and several hundred likes. This comment will be #20 as of my writing, and I'm already 2 of the others.