
New – Ready-to-use Models and Support for Custom Text and Image Classification Models in Amazon SageMaker Canvas

Today AWS announces new features in Amazon SageMaker Canvas that help business analysts generate insights from thousands of documents, images, and lines of text in minutes with machine learning (ML). Starting today, you can access ready-to-use models and create custom text and image classification models alongside previously supported custom models for tabular data, all without requiring ML experience or writing a line of code.

Business analysts across different industries want to apply AI/ML solutions to generate insights from a variety of data and respond to ad hoc analysis requests from business stakeholders. By applying AI/ML in their workflows, analysts can automate manual, time-consuming, and error-prone processes such as inspection, classification, and extraction of insights from raw data, images, or documents. However, applying AI/ML to business problems requires technical expertise, and building custom models can take several weeks or even months.

Launched in 2021, Amazon SageMaker Canvas is a visual, point-and-click service that allows business analysts to use a variety of ready-to-use models or create custom models to generate accurate ML predictions on their own.

Ready-to-use Models
Customers can use SageMaker Canvas to access ready-to-use models that extract information and generate predictions from thousands of documents, images, and lines of text in minutes. These ready-to-use models include sentiment analysis, language detection, entity extraction, personal information detection, object and text detection in images, expense analysis for invoices and receipts, identity document analysis, and more generalized document and form analysis.

For example, you can select the sentiment analysis ready-to-use model and upload product reviews from social media and customer support tickets to quickly understand how your customers feel about your products. Using the personal information detection ready-to-use model, you can detect and redact personally identifiable information (PII) from emails, support tickets, and documents. Using the expense analysis ready-to-use model, you can easily detect and extract data from your scanned invoices and receipts and generate insights about that data.

These ready-to-use models are powered by AWS AI services, including Amazon Rekognition, Amazon Comprehend, and Amazon Textract.
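SageMaker Canvas itself requires no code, but if you are curious about what a ready-to-use model does behind the scenes, you can call the underlying service directly. As a minimal sketch, here is how you might ask Amazon Comprehend, which powers the sentiment analysis model, to score a single review from the AWS CLI (the review text is just an example):

$ aws comprehend detect-sentiment \
    --language-code en \
    --text "I love the battery life on this phone, but the camera is disappointing."

The response includes the detected sentiment (POSITIVE, NEGATIVE, NEUTRAL, or MIXED) along with a confidence score for each.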

Ready-to-use models available

Custom Text and Image Classification Models
Customers who need custom models trained for their business-specific use cases can use SageMaker Canvas to create text and image classification models.

You can use SageMaker Canvas to create custom text classification models to classify data according to your needs. For example, imagine that you work as a business analyst at a company that provides customer support. When a customer support agent engages with a customer, they create a ticket and need to record the ticket type, for example, “incident”, “service request”, or “problem”. This field is often left blank, which makes the data hard to analyze when it’s time to report on it. Now, using SageMaker Canvas, you can create a custom text classification model, train it with existing customer support ticket information and ticket types, and use it to predict the type of tickets in the future when working on a report with missing data.

You can also use SageMaker Canvas to create custom image classification models using your own image datasets. For instance, imagine you work as a business analyst at a company that manufactures smartphones. As part of your role, you need to prepare reports and respond to questions from business stakeholders related to quality assessment and its trends. Every time a phone is assembled, a picture is automatically taken, and at the end of the week, you receive all those images. Now with SageMaker Canvas, you can create a new custom image classification model that is trained to identify common manufacturing defects. Then, every week, you can use the model to analyze the images and predict the quality of the phones produced.

SageMaker Canvas in Action
Let’s imagine that you are a business analyst for an e-commerce company. You have been tasked with understanding the customer sentiment towards all the new products for this season. Your stakeholders require a report that aggregates the results by item category to decide what inventory they should purchase in the following months. For example, they want to know if the new furniture products have received positive sentiment. You have been provided with a spreadsheet containing reviews for the new products, as well as an outdated file that categorizes all the products on your e-commerce platform. However, this file does not yet include the new products.

To solve this problem, you can use SageMaker Canvas. First, you will need to use the sentiment analysis ready-to-use model to understand the sentiment for each review, classifying them as positive, negative, or neutral. Then, you will need to create a custom text classification model that predicts the categories for the new products based on the existing ones.

Ready-to-use Model – Sentiment Analysis
To quickly learn the sentiment of each review, you can run a batch prediction on all the product reviews and generate a file with the sentiment predictions.

To get started, locate Sentiment analysis on the Ready-to-use models page, and under Batch prediction, select Import new dataset.

Using ready-to-use sentiment analysis with a batch dataset

When you create a new dataset, you can upload the dataset from your local machine or use Amazon Simple Storage Service (Amazon S3). For this demo, you will upload the file locally. You can find all the product reviews used in this example in the Amazon Customer Reviews dataset.

After you finish uploading the file and creating the dataset, you can Generate predictions.

Select dataset and generate predictions

Depending on the size of the dataset, prediction generation takes less than a minute, and then you can view or download the results.

View or download predictions

The results from this prediction can be downloaded as a .csv file or viewed from the SageMaker Canvas interface. You can see the sentiment for each of the product reviews.

Preview results from ready-to-use model

Now you have the first part of your task ready—you have a .csv file with the sentiment of each review. The next step is to classify those products into categories.

Custom Text Classification Model
To classify the new products into categories based on the product title, you need to train a new text classification model in SageMaker Canvas.

In SageMaker Canvas, create a New model of the type Text analysis.

The first step when creating the model is to select a dataset with which to train the model. You will train this model with a dataset from last season, which contains all the products except for the new collection.

Once the dataset has finished importing, you will need to select the column that contains the data you want to predict, which in this case is the product_category column, and the column that will be used as the input for the model to make predictions, which is the product_title column.

After you finish configuring that, you can start to build the model. There are two modes of building:

  • Quick build, which returns a model in 15–30 minutes.
  • Standard build, which takes 2–5 hours to complete.

To learn more about the differences between the build modes, you can check the documentation. For this demo, pick quick build, as our dataset has fewer than 50,000 rows.

Prepare and build your model

When the model is built, you can analyze how the model performs. SageMaker Canvas uses the 80-20 approach; it trains the model with 80 percent of the data from the dataset and uses 20 percent of the data to validate the model.
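Canvas performs this split for you automatically. Purely to illustrate the idea, here is a rough shell sketch of an 80-20 split done by hand on a local CSV (the products.csv file and its header row are hypothetical; Canvas needs none of this):

# Shuffle the rows (skipping the header), then keep the first 80 percent for training
$ tail -n +2 products.csv | shuf > shuffled.csv
$ train=$(( $(wc -l < shuffled.csv) * 80 / 100 ))
$ head -n "$train" shuffled.csv > train.csv
# The remaining 20 percent becomes the validation set
$ tail -n +"$(( train + 1 ))" shuffled.csv > validation.csv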

Model score

When the model finishes building, you can check the model score. The scoring section gives you a visual sense of how accurate the predictions were for each category. You can learn more about how to evaluate your model’s performance in the documentation.

After you make sure that your model predicts with high accuracy, you can move on to generating predictions. This step is similar to the ready-to-use models for sentiment analysis. You can make a prediction on a single product or on a set of products. For a batch prediction, you need to select a dataset and let the model generate the predictions. For this example, you will select the same dataset that you selected in the ready-to-use model, the one with the reviews. This can take a few minutes, depending on the number of products in the dataset.

When the predictions are ready, you can download the results as a .csv file or view how each product was classified. In the prediction results, each product is assigned only one category based on the categories provided during the model-building process.

Predict categories

Now you have all the necessary resources to conduct an analysis and evaluate the performance of each product category with the new collection based on customer reviews. Using SageMaker Canvas, you were able to access a ready-to-use model and create a custom text classification model without having to write a single line of code.

Available Now
Ready-to-use models and support for custom text and image classification models in SageMaker Canvas are available in all AWS Regions where SageMaker Canvas is available. You can learn more about the new features and how they are priced by visiting the SageMaker Canvas product detail page.

— Marcia

AWS Application Migration Service Major Updates: Import and Export Feature, Source Server Migration Metrics Dashboard, and Additional Post-Launch Actions

AWS Application Migration Service (AWS MGN) can simplify and expedite your migration to AWS by automatically converting your source servers from physical, virtual, or cloud infrastructure to run natively on AWS. In the post How to Use the New AWS Application Migration Service for Lift-and-Shift Migrations, Channy introduced us to Application Migration Service and how to get started.

By using Application Migration Service for migration, you can minimize time-intensive, error-prone manual processes by automating replication and conversion of your source servers from physical, virtual, or cloud infrastructure to run natively on AWS. Last year, we introduced major improvements such as new migration servers grouping, an account-level launch template, and a post-launch actions template.

Today, I’m pleased to announce three major updates of Application Migration Service. Here’s the quick summary for each feature release:

  • Import and export – You can now use Application Migration Service to import your source environment inventory list to the service from a CSV file. You can also export your source server inventory for reporting purposes, offline reviews and updates, integration with other tools and AWS services, and performing bulk configuration changes by reimporting the inventory list.
  • Server migration metrics dashboard – This new dashboard can help simplify migration project management by providing an aggregated view of the migration lifecycle status of your source servers.
  • Additional post-launch modernization actions – In this update, Application Migration Service added eight additional predefined post-launch actions. These actions are applied to your migrated applications when you launch them on AWS.

Let me share how you can use these features for your migration.

Import and Export
Before we go further into the import and export features, let’s discuss two concepts you can define when migrating with Application Migration Service: applications and waves. Applications represent groups of servers; by defining a group of servers as an application, you can perform various activities on them with Application Migration Service, such as monitoring, specifying tags, and performing bulk operations, for example, launching test instances. Additionally, you can group your applications into waves, which represent the sets of servers that are migrated together as part of your migration plan.

With the import feature, you can now import your inventory list in CSV form into Application Migration Service. This makes it easy for you to manage large-scale migrations and ingest your inventory of source servers, applications, and waves, including their attributes.

To start using the import feature, I need to identify my servers and application inventory. I can do this manually or by using discovery tools. The next thing I need to do is download the import template, which I can access from the console.

After I download the import template, I can start mapping my inventory list into this template. While mapping my inventory, I can group related servers into applications and waves. I can also perform configurations, such as defining Amazon Elastic Compute Cloud (Amazon EC2) launch template settings and specifying tags for each wave.

The following screenshot is an example of the results of my import template:

The next step is to upload my CSV file to an Amazon Simple Storage Service (Amazon S3) bucket. Then, I can start the import process from the Application Migration Service console by referencing the CSV file containing my inventory list that I’ve uploaded to the S3 bucket.
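If you prefer to script this step instead of using the console, the import can also be started from the AWS CLI. This is a rough sketch only: it assumes the start-import command in the mgn namespace, the bucket and key names are hypothetical, and the exact parameter shape is worth confirming against the current CLI reference.

$ aws mgn start-import --s3-bucket-source s3Bucket=my-migration-bucket,s3Key=inventory.csv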

When the import process is complete, I can see the details of the import results.

I can import inventory for servers that don’t have an agent installed, or haven’t yet been discovered by agentless replication. However, to replicate data, I need to use agentless replication, or install the AWS Replication Agent on my source servers.

Now I can view all my inventory on the Source servers, Applications, and Waves pages in the Application Migration Service console. The following is a screenshot of recently imported waves.

In addition, with the export feature, I can export my source servers, applications, and waves along with all configurations that I’ve defined into a CSV file.

This is helpful if you want to do reporting or offline reviews, or for bulk editing before reimporting the CSV file into Application Migration Service.
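Under the same assumptions (a hypothetical bucket and key, and a start-export command in the mgn CLI namespace), a scripted export might look like this:

$ aws mgn start-export --s3-bucket my-migration-bucket --s3-key exported-inventory.csv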

Server Migration Metrics Dashboard
We previously supported a migration metrics dashboard for applications and waves. In this release, we have added a migration metrics dashboard specifically for servers. Now you can view aggregated overviews of your servers’ migration process on the Application Migration Service dashboard. Three topics are available in the migration metrics dashboard:

  • Alerts – Shows associated alerts for respective servers.
  • Data replication status – Shows an overview of the data replication status for your source servers, so you get a quick view of where each server is in the replication process.
  • Migration lifecycle – Shows an overview of the migration lifecycle status of your source servers.

Additional Predefined Post-launch Modernization Actions
Post-launch actions allow you to control and automate actions performed after your servers have been launched in AWS. You can use predefined or custom post-launch actions.

Application Migration Service now has eight additional predefined post-launch actions to run in your EC2 instances on top of the existing four predefined post-launch actions. These additional post-launch actions provide you with flexibility to maximize your migration experience.

The new additional predefined post-launch actions are as follows:

  • Convert MS-SQL license – You can easily convert Windows MS-SQL BYOL to an AWS license using the Windows MS-SQL license conversion action. The launch process includes checking the SQL edition (Enterprise, Standard, or Web) and using the right AMI with the right billing code.
  • Create AMI from instance – You can create a new Amazon Machine Image (AMI) from your Application Migration Service launched instance.
  • Upgrade Windows version – This feature allows you to easily upgrade your migrated server to Windows Server 2012 R2, 2016, 2019, or 2022. You can see the full list of available OS versions on the AWSEC2-CloneInstanceAndUpgradeWindows page.
  • Conduct EC2 connectivity checks – You can conduct network connectivity checks to a predefined list of ports and hosts using the EC2 connectivity check feature (see the quick sketch after this list for the kind of test it performs).
  • Validate volume integrity – You can use this feature to ensure that Amazon Elastic Block Store (Amazon EBS) volumes on the launched instance are the same size as the source, properly mounted on the EC2 instance, and accessible.
  • Verify process status – You can validate the process status to ensure that processes are in a running state after instance launch. You will need to provide a list of processes that you want to verify and specify how long the service should wait before testing begins. This feature lets you do the needed validations automatically and saves time by not having to do them manually.
  • CloudWatch agent installation – Use the Amazon CloudWatch agent installation feature to install and set up the CloudWatch agent and Application Insights features.
  • Join Directory Service domain – You can simplify the AWS join domain process by using this feature. If you choose to activate this action, your instance will be managed by the AWS Cloud Directory (instead of on premises).
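The EC2 connectivity check action mentioned above runs its tests for you after launch. Purely to illustrate the kind of check it performs, here is a rough shell sketch with hypothetical hosts and ports:

# Verify the launched instance can reach a couple of hypothetical dependencies
$ for port in 443 1433; do nc -zv -w 3 db.internal.example.com "$port"; done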

Things to Know
Keep in mind the following:

  • Updated UI/UX – We have updated the user interface with card layout and table layout views for the action list on the Application Migration Service console. This update helps you determine which post-launch actions are suitable for your use case. We have also added filter options to make it easy to find relevant actions by operating system, category, and more.
  • Support for additional OS versions – Application Migration Service now supports CentOS 5.5 and later and Red Hat Enterprise Linux (RHEL) 5.5 and later operating systems.
  • Availability – These features are available now, and you can start using them today in all Regions where Application Migration Service is supported.

Get Started Today

Visit the Application Migration Service User Guide page to learn more about these features and understand the pricing. You can also visit Getting started with AWS Application Migration Service to learn more about how to start migrating your workloads.

Happy migrating!

Donnie

AWS Week in Review – March 13, 2023

It seems like only yesterday I was last writing the Week in Review post, at the end of January, and now here we are almost mid-way through March, almost into spring in the northern hemisphere, and close to a quarter way through 2023. Where does time fly?!

Last Week’s Launches
Here are some of the launches and other news from the past week that I want to bring to your attention:

New AWS Heroes: At the center of the AWS Community around the globe, Heroes share their knowledge and enthusiasm. Welcome to Ananda in Indonesia, and Aidan and Wendy in Australia, our newly announced Heroes!

General Availability of AWS Application Composer: Launched in preview during Dr. Werner Vogels’ re:Invent 2022 keynote, AWS Application Composer is a tool enabling the composition and configuration of serverless applications using a visual design surface. The visual design is backed by an AWS CloudFormation template, making it deployment ready.

What I find particularly cool about Application Composer is that it also works on existing serverless application templates, and round-trips changes to the template made in either a code editor or the visual designer. This makes it ideal for both new developers and experienced serverless developers with existing applications.

My colleague Channy’s post provides an introduction, and Application Composer is also featured in last Friday’s AWS on Air show, available to watch on-demand.

Get daily feature updates via Amazon SNS: One thing I’ve learned since joining AWS is that the service teams don’t stand still, and are releasing something new pretty much every day. Sometimes, multiple things! This can, however, make it hard to keep up. So, I was interested to read that you can now receive daily feature updates, in email, by subscribing to an Amazon Simple Notification Service (Amazon SNS) topic. As usual, Jeff’s post has all the details you need to get started.

Using up to 10GB of ephemeral storage for AWS Lambda functions: If you use Lambda for Extract-Transform-Load (ETL) jobs, or any data-intensive jobs that require temporary storage of data during processing, you can now configure up to 10GB of ephemeral storage, mounted at /tmp, for your functions in six additional Regions – Asia Pacific (Hyderabad), Asia Pacific (Jakarta), Asia Pacific (Melbourne), Europe (Spain), Europe (Zurich), and Middle East (UAE). More information on using ephemeral storage with Lambda functions can be found in this blog post.
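Raising the limit for an existing function is a single configuration change; for example, with the AWS CLI (the function name is hypothetical, and Size is expressed in MB, so 10240 means 10GB):

$ aws lambda update-function-configuration --function-name my-etl-function --ephemeral-storage Size=10240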

Increased table counts for Amazon Redshift: Workloads that require large numbers of tables can now take advantage of using up to 200K tables, avoiding the need to split tables across multiple data warehouses. The updated limit is available to workloads using the ra3.4xlarge, ra3.16xlarge, and dc2.8xlarge node types with Redshift Serverless and data warehouse clusters.

Faster, simpler permissions setup for AWS Glue: Glue is a serverless data integration and ETL service for discovering, preparing, moving, and integrating data intended for use in analytics and machine learning (ML) workloads. A new guided permissions setup process, available in the AWS Management Console, makes it simpler and easier to grant AWS Identity and Access Management (IAM) roles and users access to Glue, and to use a default role for running jobs and working with notebooks. This simpler, guided approach helps users start authoring jobs and working with the Data Catalog without further setup.

Microsoft Active Directory authentication for the MySQL-Compatible Edition of Amazon Aurora: You can now use Active Directory, either with an existing on-premises directory or with AWS Directory Service for Microsoft Active Directory, to authenticate database users when accessing Amazon Aurora MySQL-Compatible Edition instances, helping reduce operational overhead. It also enables you to make use of native Active Directory credential management capabilities to manage password complexities and rotation, helping you stay in step with your compliance and security requirements.

Launch of the 2023 AWS DeepRacer League and new competition structure: The DeepRacer tracks are one of my favorite things to visit and watch at AWS events, so I was happy to learn the new 2023 league is now underway. If you’ve not heard of DeepRacer, it’s the world’s first global racing league featuring autonomous vehicles, enabling developers of all skill levels to not only compete to complete the track in the shortest time but also to advance their knowledge of machine learning (ML) in the process. Along with the new league, there are now more chances to earn achievements and prizes using an all-new three-tier competition spanning national and regional races. Racers compete for a chance to win a spot in the World Championship, held at AWS re:Invent, and a $43,000 prize purse. What are you waiting for? Start your (ML) engines today!

AWS open-source news and updates: The latest newsletter highlighting open-source projects, tools, and demos from the AWS Community is now available. The newsletter is published weekly, and you can find edition 148 here.

For a full list of AWS announcements, be sure to keep an eye on the What’s New at AWS page.

Upcoming AWS Events
Here are some upcoming events you may be interested in checking out:

AWS Pi Day: March 14th is the third annual AWS Pi Day. Join in with the celebrations of the 17th birthday of Amazon Simple Storage Service (Amazon S3) and the cloud in a live virtual event hosted on the AWS on Air channel. There’ll also be news and discussions on the latest innovations across Data services on AWS, including storage, analytics, AI/ML, and more.

.NET developers and architects looking to modernize their applications will be interested in an upcoming webinar, Modernize and Optimize by Containerizing .NET Applications on AWS, scheduled for March 22nd. In this webinar, you’ll find demonstrations on how you can enhance the security of legacy .NET applications through modernizing to containers, update to a modern version of .NET, and run them on the latest versions of Windows. Registration for the online event is open now.

You can find details on all upcoming events, in-person and virtual, here.

New Livestream Shows
A few new livestream shows launched recently that I’d like to bring to your attention:

My colleague Isaac has started a new .NET on AWS show, streaming on Twitch. The second episode was live last week; catch up here on demand. Episode 1 is also available here.

I mentioned AWS on Air earlier in this post, and hopefully you’ve caught our weekly Friday show streaming on Twitch, Twitter, YouTube, and LinkedIn. Or, maybe you’ve seen us broadcasting live from AWS events such as Summits or AWS re:Invent. But did you know that some of the hosts of the shows have recently started their own individual shows too? Check out these new shows below:

  • AWS on Air: Startup! – hosted by Jillian Forde, this show focuses on learning technical and business strategies from startup experts to build and scale your startup in AWS. The show runs every Tuesday at 10am PT/1pm ET.
  • AWS On Air: Under the Hood with AWS – in this show, host Art Baudo chats with guests, including AWS technical leaders and customers, about Cloud infrastructure. In last week’s show, the discussion centered around Amazon Elastic Compute Cloud (Amazon EC2) Mac Instances. Watch live every Tuesday at 2pm PT/5pm ET. 
  • AWS on Air: Lockdown! – join host Kyle Dickinson each Tuesday at 11am PT/2pm ET for this show, covering a breadth of AWS security topics in an approachable way that’s suitable for all levels of AWS experience. You’ll encounter demos, guest speakers from AWS, AWS Heroes, and AWS Community Builders. 
  • AWS on Air: Step up your GameDay – hosts AM Grobelny and James Spencer are joined by special guests to strategize and navigate through an AWS GameDay, a fun and challenge-oriented way to learn about AWS. You’ll find this show every second Wednesday at 11am PT/2pm ET.
  • AWS on Air: AMster & the Brit’s Code Corner – join AM Grobelny and myself as we chat about and illustrate cloud development. In Beginners Corner, we answer your questions and try to demystify this strange thing called “coding”, and in Project Corner we tackle slightly larger projects of interest to more experienced developers. There’s something for everyone in Code Corner, live on the 3rd Thursday of each month at 11am PT/2pm ET.

You’ll find all these AWS on Air shows in the published schedule. We hope you can join us!

That’s all for this week – check back next Monday for another AWS Week in Review.

This post is part of our Week in Review series. Check back each week for a quick roundup of interesting news and announcements from AWS!

AWS Application Composer Now Generally Available – Visually Build Serverless Applications Quickly

At AWS re:Invent 2022, we previewed AWS Application Composer, a visual builder for you to compose and configure serverless applications from AWS services backed by deployment-ready infrastructure as code (IaC).

In the keynote, Dr. Werner Vogels, CTO of Amazon.com, said:

Developers that never used serverless before. How do they know where to start? Which services do they need? How do they work together? We really wanted to make this easier. AWS Application Composer simplifies and accelerates the architecting, configuring, and building of serverless applications.

During the preview, we had lots of interest and great feedback from customers. Today, I am happy to announce the general availability of AWS Application Composer with new improvements based on customer feedback. I want to quickly review its features and introduce some improvements.

Introduction to AWS Application Composer
To get started with AWS Application Composer, choose Open demo in the AWS Management Console. This demo shows a simple cart application with Amazon API Gateway, AWS Lambda, and Amazon DynamoDB resources.

You can easily browse and search for AWS services in the left Resources panel and drag and drop them onto the canvas to expand your architecture.

In the middle Canvas panel, you can connect resources together by clicking and dragging from one resource port to another. Permissions for these resources to interact with each other are composed automatically using policy templates, environment variables, and event subscriptions. Grouping resources is a useful way to organize your architecture visually; in the example above, the API Compute group is composed of Lambda functions. When you double-click a specific resource, you can name it and configure its properties in the right Resource properties panel.

In addition to the featured resources available in the visual resource palette, hidden and read-only resources will populate on the canvas when you load an existing template that includes them.

In this example, the MyHttpApi resource is a hidden resource. It is not available from the resource palette but does appear on the canvas in color. The resource named MyHttpApiRole (in this case, an AWS::IAM::Role resource) is read-only and appears grayed out on the canvas. To learn more about all supported resources, see AWS Application Composer featured resources in the AWS documentation.

When you select the Template menu, you can view, edit, or manually download your IaC, such as an AWS Serverless Application Model (AWS SAM) template. Your changes are automatically synced with your canvas.

When you start Connected mode, you can use Application Composer with local tools such as an integrated development environment (IDE). Any changes activate the automatic synchronization of your project template and files between Application Composer and your local project directory.

It is useful to incorporate into your existing team processes, such as local testing with AWS SAM Command Line Interface (CLI), peer review through version control, or deployment through AWS CloudFormation and continuous integration and delivery (CI/CD) pipelines.
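For example, once Connected mode has synced the generated template to your local project directory, a typical iteration loop with the AWS SAM CLI might look like this (illustrative only):

$ sam build                 # build the functions defined in the synced template
$ sam local start-api       # exercise the API locally before deploying
$ sam deploy --guided       # deploy through AWS CloudFormation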

This mode is supported on Chrome and Edge browsers and requires you to grant temporary local file system access to your browser.

AWS Application Composer can be used in real-world scenarios such as:

  • Building a prototype of serverless applications
  • Reviewing and collaboratively evolving existing serverless projects
  • Generating diagrams for documentation or Wikis
  • Onboarding new team members to a project
  • Reducing the first steps to deploy something in an AWS account

To see more real-world examples, read Visualize and create your serverless workloads with AWS Application Composer in the AWS Compute Blog and How I Used AWS Application Composer to Make Analyzing My Meetup Data Easy in BuildOn.AWS, or watch a breakout session video (SVS211) from AWS re:Invent 2022.

Improvements Since Preview Launch
Here is a new feature to improve how you work with Amazon Simple Queue Service (Amazon SQS) queues.

You can now connect Amazon API Gateway resources directly to Amazon SQS without routing requests through an AWS Lambda function. This removes the complexity of the Lambda function’s execution, increases reliability, and reduces lines of code.

For example, you can drag API Gateway and Amazon SQS onto the canvas and connect the two resources. When you drag the connector from an API route to the SQS queue, Send message appears, and you can connect the API route to the queue with your choice of integration target.

The new Change Inspector provides a visual diff of template changes made when you connect two resources on the canvas. This information is available as a notification when you make the connection, which helps you understand how Composer manages integration configuration in your IaC template as you build.

Here are some more improvements to your experience in the user interface!

First, we reduced the size of resource cards. The larger cards made it difficult for the users to read and view their template on the canvas. Now, you can arrange more resource cards easily and save space on the canvas.

Also, we added zoom in and out and zoom to fit buttons so that users can quickly view the entire screen or zoom to the desired level. When you load a large template onto the canvas, you can easily see all the resource cards in any size.

Now Available
AWS Application Composer is now generally available in the US East (Ohio), US East (N. Virginia), US West (Oregon), Asia Pacific (Singapore), Asia Pacific (Sydney), Asia Pacific (Tokyo), Europe (Frankfurt), Europe (Ireland), and Europe (Stockholm) Regions, adding three more Regions to the six Regions available during preview. There is no additional cost, and you can start using it today.

To learn more, see the AWS Application Composer Developer Guide and send feedback to AWS re:Post for AWS Application Composer or through your usual AWS support contacts.

Channy

Subscribe to AWS Daily Feature Updates via Amazon SNS

Way back in 2015 I showed you how to Subscribe to AWS Public IP Address Changes via Amazon SNS. Today I am happy to tell you that you can now receive timely, detailed information about releases and updates to AWS via the same, simple mechanism.

Daily Feature Updates
Simply subscribe to topic arn:aws:sns:us-east-1:692768080016:aws-new-feature-updates using the email protocol and confirm the subscription in the usual way:

You will receive daily emails that start off like this, with an introduction and a summary of the update:

After the introduction, the email contains a JSON representation of the daily feature updates:

As noted in the message, the JSON content is also available online at URLs that look like https://aws-new-features.s3.us-east-1.amazonaws.com/update/2023-02-27.json. You can also edit the date in the URL to access historical data going back up to six months.
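If you would rather pull a given day’s update programmatically instead of waiting for the email, fetching the JSON directly works just as well (jq is optional here, used only for pretty-printing):

$ curl -s https://aws-new-features.s3.us-east-1.amazonaws.com/update/2023-02-27.json | jq .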

The email message also includes detailed information about changes and additions to managed policies that will be of particular interest to AWS customers who currently manually track and then verify the impact that these changes may have on their security profile. Here’s a sample list of changes (additional permissions) to existing managed policies:

And here’s a new managed policy:

Even More Information
The header of the email contains a link to a treasure trove of additional information. Here are some examples:

AWS Regions and AWS Services – A pair of tables. The first one includes a row for each AWS Region and a column for each service, and the second one contains the transposed version:

AWS Regions and EC2 Instance Types – Again, a pair of tables. The first one includes a row for each AWS Region and a column for each EC2 instance type, and the second one contains the transposed version:

The EC2 Instance Types Configuration link leads to detailed information about each instance type:

Each page also includes a link to the same information in JSON form. For example, the EC2 Instance Types Configuration data starts like this:

{
    "a1.2xlarge": {
        "af-south-1": "-",
        "ap-east-1": "-",
        "ap-northeast-1": "a1.2xlarge",
        "ap-northeast-2": "-",
        "ap-northeast-3": "-",
        "ap-south-1": "a1.2xlarge",
        "ap-south-2": "-",
        "ap-southeast-1": "a1.2xlarge",
        "ap-southeast-2": "a1.2xlarge",
        "ap-southeast-3": "-",
        "ap-southeast-4": "-",
        "ca-central-1": "-",
        "eu-central-1": "a1.2xlarge",
        "eu-central-2": "-",
        "eu-north-1": "-",
        "eu-south-1": "-",
        "eu-south-2": "-",
        "eu-west-1": "a1.2xlarge",
        "eu-west-2": "-",
        "eu-west-3": "-",
        "me-central-1": "-",
        "me-south-1": "-",
        "sa-east-1": "-",
        "us-east-1": "a1.2xlarge",
        "us-east-2": "a1.2xlarge",
        "us-gov-east-1": "-",
        "us-gov-west-1": "-",
        "us-west-1": "-",
        "us-west-2": "a1.2xlarge"
    },

Other information includes:

  • VPC Endpoints
  • AWS Services Integrated with Service Quotas
  • Amazon SageMaker Instance Types
  • RDS DB Engine Versions
  • Amazon Nimble Instance Types
  • Amazon MSK Apache Kafka Versions

Information Sources
The information is pulled from multiple public sources, cross-checked, and then issued. Here are some of the things that we look for:

Things to Know
Here are a couple of things that you should keep in mind about the AWS Daily Feature Updates:

Content – The content provided in the Daily Feature Updates and in the treasure trove of additional information will continue to grow as new features are added to AWS.

Region Coverage – The Daily Feature Updates cover all AWS Regions in the public partition. Where possible, they also provide information about GovCloud Regions; this currently includes EC2 Instance Types, SageMaker Instance Types, and Amazon Nimble Instance Types.

Region Mappings – The internal data that drives all of the information related to AWS Regions is updated once a day if there are applicable new features, and also when new AWS Regions are enabled.

Updates – On days when there are no updates, there will not be an email notification.

Usage – Similar to the updates on the What’s New page and the associated RSS feed, the updates are provided for informational purposes, and you still need to do your own evaluation and testing before deploying to production.

Command Line Subscription – If you have access to the AWS Command Line Interface (AWS CLI), you can subscribe from the command line:

$ aws sns subscribe --topic-arn arn:aws:sns:us-east-1:692768080016:aws-new-feature-updates --protocol email --notification-endpoint [email protected]
{
    "SubscriptionArn": "pending confirmation"
}

Jeff

AWS Week in Review – March 6, 2023

It has been a week full of interesting launches and I am thrilled to be able to share them with you today. We’ve got a new region in the works, a new tool for researchers, updates to Amazon Timestream, Control Tower, and Amazon Inspector, Lambda Powertools for .NET, existing services in new locations, lots of posts from other AWS blogs, upcoming events, and more.

Last Week’s Launches
Here are some of the launches that caught my eye this past week:

AWS Region in Malaysia – We are working on an AWS Region in Malaysia, bringing the number of regions that are currently in the works to five. The upcoming region will include three Availability Zones, and represents our commitment to invest at least $6 Billion in Malaysia by 2037. You can read my post to learn about how our enterprise, startup, and public sector customers are already using AWS.

Amazon Lightsail for Research – You can get access to analytical applications such as Scilab, RStudio, and Jupyter with just a couple of clicks. Instead of processing large data sets on your laptop, you can get to work quickly without having to deal with hardware setup, software setup, or tech support.

Batch Loading of Data into Amazon Timestream – You can now batch-load time series data into Amazon Timestream. You upload the data to an Amazon Simple Storage Service (Amazon S3) bucket in CSV form, specify a target database and table, and a data model. The ingestion happens automatically and reliably, with parallel processes at work for efficiency.

Control Tower Progress Tracker – AWS Control Tower now includes a progress tracker that shows you the milestones (and their status) of the landing zone setup and upgrade process. Milestones such as updating shared accounts for logging, configuring Account Factory, and enabling mandatory controls are tracked so that you have additional visibility into the status of your setup or upgrade process.

Kinesis Data Streams Throughput Increase – Each Amazon Kinesis Data Stream now supports up to 1 GB/second of write throughput and 2 GB/second of read throughput, both in On-Demand capacity mode. To reach this level of throughput for your data streams you will need to submit a Support Ticket, as described in the What’s New.

Lambda Powertools for .NET – This open source developer library is now generally available. It helps you to incorporate Well-Architected serverless best practices into your code, with a focus on observability features including distributed tracing, structured logging, and asynchronous metrics (both business and applications).

Amazon Inspector Code Scans for Lambda Functions – This preview launch gives Amazon Inspector the power to scan your AWS Lambda functions for vulnerabilities such as injection flaws, data leaks, weak cryptography, or missing encryption. Findings are aggregated in the Amazon Inspector console, routed to AWS Security Hub, and pushed to Amazon EventBridge.

X in Y – We made existing services and features available in additional regions and locations:

For a full list of AWS announcements, take a look at the What’s New at AWS page, and consider subscribing to the page’s RSS feed.

Interesting Blog Posts

Other AWS Blogs – Here are some fresh posts from a few of the other AWS Blogs:

AWS Open Source – My colleague Ricardo writes a weekly newsletter to highlight new open source projects, tools, and demos from the AWS Community. Read edition #147 to learn more.

Upcoming AWS Events
Check your calendar and be sure to attend these upcoming events:

AWSome Women Community Summit LATAM 2023 – Organized by members of the women-led AWS communities in Perú, Chile, Argentina, Guatemala, and Colombia, this event will take place in Bogotá, Colombia, with an online option as well.

AWS Pi Day – Join us on March 14th for the third annual AWS Pi Day, a live virtual event hosted on the AWS On Air channel on Twitch, as we celebrate the 17th birthday of Amazon S3 and the cloud.

We will discuss the latest innovations across AWS Data services, from storage to analytics and AI/ML. If you are curious about how AI can transform your business, register here and join my session.

AWS Innovate Data and AI/ML edition – AWS Innovate is a free online event to learn the latest from AWS experts and get step-by-step guidance on using AI/ML to drive fast, efficient, and measurable results. Register now for EMEA (March 9) and the Americas (March 14th).

You can browse all upcoming AWS-led in-person and virtual events, as well as developer-focused events such as Community Days.

And that’s all for today!

Jeff

In the Works – AWS Region in Malaysia

We launched an AWS Region in Australia earlier this year, four more (Switzerland, Spain, the United Arab Emirates, and India) in 2022, and are working on regions in Canada, Israel, New Zealand, and Thailand. All told, we now have 99 Availability Zones spread across 31 geographic regions.

Malaysia in the Works
Today I am happy to announce that we are working on an AWS region in Malaysia. This region will give AWS customers the ability to run workloads and store data that must remain in-country.

The region will include three Availability Zones (AZs), each one physically independent of the others in the region yet far enough apart to minimize the impact that an AZ-level event could have on business continuity. The AZs will be connected to each other by high-bandwidth, low-latency network connections over dedicated, fully redundant fiber.

AWS in Malaysia
We are planning to invest at least $6 Billion (25.5 billion Malaysian ringgit) in Malaysia by 2037.

Many organizations in Malaysia are already making use of the existing AWS Regions. This includes enterprise and public sector organizations such as Axiata Group, Baba Products, Bank Islam Malaysia, Celcom Digi, PayNet, PETRONAS, Tenaga Nasional Berhad (TNB), Asia Pacific University of Technology & Innovation, Cybersecurity Malaysia, Department of Statistics Malaysia, Ministry of Higher Education Malaysia, and Pos Malaysia, and startups like Baba’s, BeEDucation Adventures, CARSOME, and StoreHub.

Here’s a small sample of some of the exciting and innovative work that our customers are doing in Malaysia:

Johor Corporation (JCorp) is the principal development institution that drives the growth of the state of Johor’s economy through its operations in the agribusiness, wellness, food and restaurants, and real estate and infrastructure sectors. To power JCorp’s digital transformation and achieve the JCorp 3.0 reinvention plan goals, the company is leveraging the AWS cloud to manage its data and applications, serving as a single source of truth for its business and operational knowledge, and paving the way for the company to tap into artificial intelligence, machine learning, and blockchain technologies in the future.

Radio Televisyen Malaysia (RTM), established in 1946, is the national public broadcaster of Malaysia, bringing news, information, and entertainment programs through its six free-to-air channels and 34 radio stations to millions of Malaysians daily. Bringing cutting-edge AWS technologies closer to RTM in Malaysia will accelerate the time it takes to develop new media services, while delivering a better viewer experience with lower latency.

Bank Islam, Malaysia’s first listed Islamic banking institution, provides end-to-end financial solutions that meet the diverse needs of their customers. The bank taps AWS’ expertise to power its digital transformation and the development of Be U digital bank through its Centre of Digital Experience, a stand-alone division that creates cutting-edge financial services on AWS to enhance customer experiences.

Malaysian Administrative Modernization and Management Planning Unit (MAMPU) encourages public sector agencies to adopt cloud in all ICT projects in order to accelerate the application of emerging technologies and increase the efficiency of public services. MAMPU believes the establishment of the AWS Region in Malaysia will further accelerate the digitalization of the public sector and bolster efforts by public sector agencies to deliver advanced citizen services seamlessly.

Malaysia is also home to both Independent Software Vendors (ISVs) and Systems Integrators that are members of the AWS Partner Network (APN). The ISV partners build innovative solutions on AWS and the SIs provide business, technical, marketing, and go-to-market support to customers. AWS Partners based in Malaysia include Axrail, eCloudvalley, Exabytes, G-AsiaPacific, GHL, Maxis, Radmik Solutions Sdn Bhd, Silverlake, Tapway, Fourtitude, and Wavelet.

New Explainer Video
To learn more about our global infrastructure, be sure to watch our new AWS Global Infrastructure Explainer video:

Stay Tuned
As usual, subscribe to this blog so that you will be among the first to know when the new region is open!

Jeff

Tis Tech and Wazuh sign a partnership agreement

San Jose, California, December 2022. We are glad to announce that Tis Tech has signed a partnership agreement with Wazuh. Tis Tech’s core business is focused on the Information and Communication Technologies (ICT) sector, and its main objective is to deliver solutions for its customers’ businesses.

The business consulting services offered by Tis Tech include Project Management, BPM, Process Integrity, Systems and Information Security, Application Development, ERP and CRM, Change Management, Training, and Infrastructure.

Tis Tech works for companies worldwide, and its main clients are public sector institutions, oil & gas industries, financial institutions, private organizations, telecoms, and utility companies.

“We have chosen to work with Wazuh because they are one of the best in their field. They developed a great open-source software, which captivated a large crowd to follow the brand and cultivated a wonderful support community. The software is easy-to-use and customizable, requires no changes to the infrastructure, and is cloud-based, which means it can be deployed to all our office branches without additional equipment cost. Above all, Wazuh provides excellent technical assistance and has been extremely helpful throughout this partnership. We are excited to be working with Wazuh and foresee a great future together.” commented Ivo Domingos, Systems Analyst at Tis Tech.

As a multicultural organization, Tis Tech has an international presence, with modern and functional facilities at their Angolan headquarters and their global representation points located in Brazil, Argentina, India, Portugal, China, and Mozambique.

“We at Wazuh are delighted that Tis Tech has chosen to work with Wazuh and that they appreciate the benefits of our open-source platform. Tis Tech has told us that they are satisfied with our technical support and have found it very helpful since the beginning of our partnership,” states Alberto Gonzalez, COO at Wazuh.

If you want to learn more about Tis Tech, please visit its official website. For more information on Wazuh Partnerships, please visit our partners’ page.
