There's nothing inherently wrong with lambda/serverless platforms, but I feel like most people are better suited by a regular app server running somewhere "normally".
I'm not going to pretend I'm a domain expert here, but from my own experience I see several common themes: serverless tends to add complexity and more potential points of failure. Scaling isn't as simple as it looks. Debugging backend issues can be harder. You get extra latency when bolting together lots of different serverless services into one cohesive application. And you take on more vendor lock-in.
I think they're best for certain types of workloads: simple POCs or small side projects with little load (these can run virtually free on any serverless platform these days), very bursty/erratic workloads, or jobs run seldom enough that a cold start isn't a problem. For most "regular" use cases, though, they're not the right fit.
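To make the cold-start point concrete, here's a minimal sketch shaped like an AWS Lambda Python handler (the `handler(event, context)` signature follows Lambda's convention; `_expensive_init` and the timings are hypothetical). Module-level code runs once when a fresh container spins up, which is where cold-start latency comes from; warm invocations skip it:

```python
import time

INIT_COUNT = 0  # tracks how many times the cold-start path actually ran

def _expensive_init():
    """Stand-in for the slow work done on a cold start:
    loading config, opening DB connections, warming caches, etc."""
    global INIT_COUNT
    INIT_COUNT += 1
    time.sleep(0.1)  # hypothetical init cost
    return {"db": "connected"}

# Module scope: executed once per container, i.e. only on a cold start.
RESOURCES = _expensive_init()

def handler(event, context):
    # Warm invocations land here immediately; RESOURCES already exists,
    # so the per-request cost is just this function body.
    return {"status": 200, "init_count": INIT_COUNT}
```

Calling `handler` repeatedly in the same container never re-runs `_expensive_init`; it's only a new container (scale-out, redeploy, or idle timeout) that pays that price again, which is why infrequent or bursty traffic keeps hitting it.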