Issue #4 - Friday February 25, 2022
What’s New @ NeuML publishes interesting content covering our open source projects, services and insights. New issues are published on a weekly to monthly schedule.
Here we go!
txtai 4.2 was released this week. This release brings further support for YAML-configured applications, advancing the "build once, run anywhere" approach.
The following new functionality is now available:
🐳 Docker images and examples
🔁 Serverless API and workflows support
☁️ Embeddings index cloud storage (AWS S3, Google Cloud Storage, Azure and more)
4.2 also has a number of improvements and bug fixes!
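As a sketch of the YAML-driven approach, a minimal application configuration that loads an embeddings index from cloud object storage could look like the following. The model path, provider and container name are illustrative assumptions, not values from this issue:

```yaml
# app.yml - minimal txtai application sketch (assumed names)
writable: true

# Pull the embeddings index from cloud object storage on startup
cloud:
  provider: s3
  container: txtai-indexes

embeddings:
  path: sentence-transformers/nli-mpnet-base-v2
```

A file like this can then back a local API instance, for example via `CONFIG=app.yml uvicorn "txtai.api:app"`.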
💭 Tutorials
The following new txtai tutorial was published this week.
Build a txtai serverless workflow with AWS SAM
Uses AWS SAM to build a local serverless txtai API instance.
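The tutorial's approach can be sketched with a minimal SAM template; the resource name, memory settings and event wiring below are illustrative assumptions, not taken from the tutorial itself:

```yaml
# template.yml - hedged sketch of a serverless txtai function
AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  TxtaiFunction:
    Type: AWS::Serverless::Function
    Properties:
      PackageType: Image   # txtai packaged as a container image
      MemorySize: 3008     # transformer models need generous memory
      Timeout: 60
      Events:
        Api:
          Type: HttpApi    # expose the function over HTTP
```

With a template along these lines, `sam build` followed by `sam local start-api` runs the API locally.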
📄 Articles
Here is a list of recent articles and content relevant to our projects we found interesting.
Firecracker MicroVMs
Firecracker runs workloads in lightweight virtual machines, called microVMs, which combine the security and isolation properties provided by hardware virtualization technology with the speed and flexibility of containers. Released by Amazon and used to run AWS Lambda and AWS Fargate.
Weave Ignite
Combines Firecracker MicroVMs with Docker images. A highly performant framework that can run over 400 MicroVMs on the same host. Great for hosting serverless applications.
Apache Libcloud
An excellent library for abstracting cloud compute resources across a number of providers. txtai uses this library to interface with cloud object storage.
💡 Roadmap
With txtai 4.2 released, upcoming work will continue moving towards the “build once, run anywhere” paradigm. This currently works well for single-node txtai embeddings indexes and transformational workflows (e.g. translation, summarization).
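As a hedged illustration of such a transformational workflow, a YAML configuration along these lines wires a summarization pipeline into a named workflow (the model path and workflow name are assumptions for the example):

```yaml
# Summarization pipeline backed by a Hugging Face model (assumed path)
summary:
  path: sshleifer/distilbart-cnn-12-6

# Workflow that runs each input element through the pipeline
workflow:
  summarize:
    tasks:
      - action: summary
```

The same pattern applies to other transformational pipelines such as translation: declare the pipeline, then reference it as a task action.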
Upcoming releases will solidify how best to implement multi-node indexes on Kubernetes and serverless platforms.
🔎 Where to find NeuML
In addition to this newsletter, NeuML can be found in the following places: