Okay, there is nothing spooky about serverless. Though I would argue that it’s pretty scary how much people misrepresent serverless computing.
Day after day I hear people confuse serverless with Function-as-a-Service (FaaS), and that just tells me that I have more work to do.
Now you may notice that I have gone down to one post a month. Life has gotten a bit busier so I apologize for that. However, I have some good news. In addition to this newsletter, a podcast will be launching before the end of 2024. More details to come soon.
So moving forward, we will have one monthly newsletter with a few “mini-posts” here and there, as well as the podcast. It’s our way of continuing to give you value in multiple formats.
Anyway, let’s talk about some of the top serverless news in October of 2024!
Wiz now supports Serverless Containers for Security Scanning
Wiz is an amazing security company. Container security has always been a challenge and while there are some ways to accomplish this, there haven’t been a lot of third-party companies that specialize in this area. Aqua Security has always been a contender and they do great things but I think it’s important to have competition.
Wiz was founded in 2020, and within 4 years Google offered around $23B to acquire them, and Wiz rejected it! That shows not only how fast they grew and the impact they had, but also their leadership’s confidence in their offering. They are pursuing an IPO.
Anyway, it looks like Wiz is now offering security for serverless containers. This is a big deal as it shows that companies like Wiz are seeing serverless containers as a viable option in the marketplace and want to secure them.
Datadog did a Container Report at the top of 2024 and they noted the growth in serverless containers. By September 2023 just under 50% of organizations were leveraging serverless containers.
Cloud security will always be incredibly important. Can you even go a month without hearing about some kind of crazy security breach? I think some people get complacent by thinking “well, we are on the cloud, so clearly we are secure because X provider is securing things”. In reality, cloud security follows a shared responsibility model: the provider secures the underlying infrastructure, but you are required to secure your applications and services. Tools like Wiz can help.
OpenPR lists a story about the expected surge in serverless security moving forward. With tools extending support to serverless containers, we will likely see more adoption as more people become comfortable.
AWS Redshift Serverless is now in AWS GovCloud
It should be no secret that the government has high expectations of third parties providing cloud services. They take security very seriously, and as anyone who has worked with the federal government knows, the RFP process is a pain.
AWS, like most major cloud providers, offers government-focused regions; theirs is AWS GovCloud, a set of isolated regions designed to better protect government data. Well now, Redshift Serverless is available in AWS GovCloud. Why does this matter? As I stated, the government has high standards for cybersecurity. They will not use anything that doesn’t meet those standards.
AWS, while a competitor to my employer, is a serious enterprise. They are not going to place a product in their GovCloud offering that doesn’t meet the security standard; otherwise, they risk the future of their business. This tells me that AWS not only made the investment to make Redshift Serverless compliant, but also saw that investment as either necessary (people are asking for it) or worthwhile (people may not be asking for it but will like it when they see it).
Having a serverless data warehouse that the US government sees as trustworthy enough to store data and perform analytics speaks volumes about the growth of serverless. The offering is new, so I don’t yet know how much it will be used, but again, AWS wouldn’t have invested in it without seeing potential.
Huawei Cloud talks Cold Starts
Huawei Cloud is not one we hear about often here in the West. In fact, I don’t think they even crack the top 10 cloud providers globally. They are huge in China and have datacenters in something like 33 regions.
They even grew to $7.6B in revenue in 2023. So they are a real contender in the space even though we don’t hear about them often in the USA. Honestly, this is the first time I really heard about Huawei Cloud. I always saw them as a phone and networking device manufacturer.
I was elated to see that they recently shared some research they did on serverless runtimes. They even talk a bit about serverless containers.
The main call out is the “cold start problem”. To recap what this means, when a serverless application scales down, it’s not running at all. So when the URL gets hit by a user or API call, it will take a moment for the service to “wake up” and respond. This is often only milliseconds but can sometimes be a few seconds long.
This is one of the major trade-offs of migrating to serverless. Many people have tried to engineer around it by offering “warm starts”: the service is kept partially initialized so the startup time is much faster.
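To make the cold/warm distinction concrete, here is a minimal toy model, not any real provider’s behavior: the class, the 0.25-second penalty, and the method names are all illustrative assumptions. The first request after scaling to zero pays an initialization cost; subsequent requests reuse the warm instance.

```python
import time

class ToyServerlessFunction:
    """Toy model of a serverless runtime: the first request after
    scaling to zero pays an initialization penalty (the cold start)."""

    COLD_INIT_SECONDS = 0.25  # stand-in for pulling the image and starting the runtime

    def __init__(self):
        self.warm = False  # scaled to zero: no instance is running yet

    def invoke(self, payload):
        start = time.perf_counter()
        if not self.warm:
            time.sleep(self.COLD_INIT_SECONDS)  # simulate container/runtime spin-up
            self.warm = True                    # the instance stays warm for reuse
        response = f"handled: {payload}"        # the actual (fast) handler work
        return response, time.perf_counter() - start

fn = ToyServerlessFunction()
_, cold = fn.invoke("first request")   # pays the cold-start penalty
_, warm = fn.invoke("second request")  # hits the already-warm instance
print(f"cold start: {cold:.3f}s, warm start: {warm:.5f}s")
```

Real platforms add more variables (runtime language, package size, triggers), which is exactly what the Huawei research digs into, but the shape of the problem is the same: the first hit is slow, the rest are fast.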
They don’t provide a ton of alternative options to address it but they do call out the trend. They also go into deeper details about the various contributors to cold-start issues such as runtime language and triggers.
The data and research are on GitHub, so I recommend taking a look.
Vultr gets into the Serverless AI game
Vultr is the world’s largest privately held public cloud. They just announced their serverless inferencing platform. This is a part of the larger “Inference-as-a-Service” that I have mentioned previously.
We are seeing the rise of “Agentic AI”. Generative AI (GenAI) gives you the ability to ask an LLM a question and get a response. However, we aren’t going to ask people to use a CLI or something to do that. They expect a user-friendly interface, and chat is the most common way to accomplish this.
Well, the code for that bot needs to live somewhere. This is where serverless is PERFECT. The chat bot application scales down to zero when it’s not being used, then kicks on, does what it needs to do to “chat” with the user and access the LLM, and scales back down.
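That request/response shape can be sketched with an AWS Lambda-style handler. This is a generic sketch, not Vultr’s API: `call_llm` is a hypothetical stand-in for whatever inference endpoint or client library you actually use.

```python
import json

def call_llm(prompt):
    """Hypothetical stand-in for a real LLM API call; swap in your
    provider's inference client here."""
    return f"(model reply to: {prompt})"

def handler(event, context=None):
    """Lambda-style entry point for a chat bot. The platform spins
    this up on demand, runs it per request, and scales it back to
    zero when traffic stops."""
    body = json.loads(event.get("body", "{}"))
    message = body.get("message", "")
    if not message:
        return {"statusCode": 400, "body": json.dumps({"error": "empty message"})}
    reply = call_llm(message)
    return {"statusCode": 200, "body": json.dumps({"reply": reply})}

# Simulate one request/response cycle locally:
resp = handler({"body": json.dumps({"message": "What is serverless?"})})
print(resp["statusCode"], json.loads(resp["body"])["reply"])
```

Notice there is no server loop anywhere in the code; the platform owns the lifecycle, which is the whole point.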
This also helps with edge computing. Speed is going to be important for chatbots, so being able to run that code as close to the end user as possible matters. Serverless applications tend to be lightweight, which makes them a great fit for the edge.
Another great thing is that it will come with turnkey RAG (Retrieval-Augmented Generation) for customization. After all, the LLM isn’t trained on your data (at least I hope not), but you still want it to give relevant answers. RAG is the best way to accomplish this.
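The core RAG idea is simple: retrieve the documents most relevant to the question, then stuff them into the prompt as context. This is a deliberately naive sketch, word-overlap ranking over an in-memory list, where a real pipeline (including, presumably, Vultr’s) would use embeddings and a vector database; the documents and function names are made up for illustration.

```python
def retrieve(query, docs, k=2):
    """Naive retriever: rank documents by word overlap with the query.
    A real RAG pipeline would use embeddings and a vector store."""
    q_words = set(query.lower().split())
    scored = sorted(docs,
                    key=lambda d: len(q_words & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query, docs):
    """Augment the prompt with retrieved context so the LLM can answer
    from your data without having been trained on it."""
    context = "\n".join(retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

# Tiny made-up knowledge base:
docs = [
    "Our support line is open 9am to 5pm on weekdays.",
    "The premium plan includes serverless inference credits.",
    "Invoices are emailed on the first of each month.",
]
prompt = build_prompt("When is the support line open?", docs)
print(prompt)
```

The retrieval step swaps out; the pattern of grounding the prompt in your own data is what “turnkey RAG” packages up for you.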
This report from BusinessWire is a good read on this. What Vultr is doing with Serverless is very impressive.
Closing Thoughts
Serverless is here to stay. That’s just a fact. AI and edge computing have become a sort of tipping point for this.
Security companies see it as a viable option and want to corner the market for it. The government may well be interested in using it. And it is being used for inference with LLMs and other AI models.
We are also seeing companies investing in research and development to address known issues with serverless such as cold-starts. Serverless is growing and is providing a lot of value to developers. But as this Information Week article suggests, we need to think about toolchains and how to best leverage this platform for teams. By having conversations around architecture, we may be able to best address this.
—Photo courtesy Toni Cuenca on Pexels—