Today, we are excited to announce the launch of a new Transformer notebook on GitHub. AWS DeepComposer gives developers a creative way to get hands-on with the latest generative AI techniques and expand their machine learning skills. The Transformer is a state-of-the-art model that works with sequential data such as genomic data or stock prices. By using a mechanism called attention, the algorithm attempts to learn the relationships between the different data points so it can generate better predictions. In the AWS DeepComposer Music studio, the Transformers feature will allow you to iteratively extend a melody to create new and longer compositions. Learn more about the Transformer technique in the learning capsule in the AWS DeepComposer console.
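To make the attention idea concrete, here is a minimal sketch of scaled dot-product attention, the core operation inside a Transformer. This is an illustrative NumPy example, not code from the DeepComposer notebook; all names and shapes are chosen for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax: each row becomes a probability distribution
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def scaled_dot_product_attention(Q, K, V):
    # scores measure pairwise similarity between every pair of positions,
    # so each output position is a weighted mix of all other positions
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ V, weights

# toy "sequence" of 3 positions with 4-dimensional embeddings;
# using the same tensor for Q, K, and V gives self-attention
rng = np.random.default_rng(0)
x = rng.normal(size=(3, 4))
out, weights = scaled_dot_product_attention(x, x, x)
```

Each row of `weights` says how much that position attends to every other position; stacking this operation with learned projections is what lets the model relate distant notes in a melody.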
Amazon Aurora PostgreSQL-Compatible Edition adds support for the pg_bigm extension. The pg_bigm extension provides full-text search capability in PostgreSQL. It allows a user to create a 2-gram (bigram) index for faster full-text search.
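The 2-gram idea behind pg_bigm is simple: the text is decomposed into every consecutive pair of characters, and those pairs are indexed so a substring query can quickly narrow down candidate rows. The following Python sketch illustrates the concept only; it is not pg_bigm's actual implementation, and the function names are invented for the example.

```python
def bigrams(text):
    # split a string into its consecutive 2-character substrings,
    # e.g. "hello" -> {"he", "el", "ll", "lo"}
    return {text[i:i + 2] for i in range(len(text) - 1)}

def candidate_match(query, document):
    # a document can only contain the query substring if it contains
    # every bigram of the query; this filter is necessary but not
    # sufficient, so the database still rechecks candidates exactly
    return bigrams(query) <= bigrams(document)
```

An index over these bigrams turns an otherwise sequential `LIKE '%...%'` scan into a lookup over a small set of 2-character keys, which is why bigram indexing speeds up full-text search even for languages without word delimiters.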