Q369. A life sciences company is using a combination of open source tools to manage data analysis workflows and Docker containers running on servers in its on-premises data center to process genomics data. Sequencing data is generated and stored on a local storage area network (SAN), and then the data is processed. The research and development teams are running into capacity issues and have decided to re-architect their genomics analysis platform on AWS to scale based on workload demands and reduce the turnaround time from weeks to days. The company has a high-speed AWS Direct Connect connection. Sequencers will generate around 200 GB of data for each genome, and individual jobs can take several hours to process the data with ideal compute capacity. The end result will be stored in Amazon S3. The company is expecting 10-15 job requests each day. Which solution meets these requirements?

A. Use regularly scheduled AWS Snowball Edge devices to transfer the sequencing data into AWS. When AWS receives the Snowball Edge device and the data is loaded into Amazon S3, use S3 events to trigger an AWS Lambda function to process the data.
B. Use AWS Data Pipeline to transfer the sequencing data to Amazon S3. Use S3 events to trigger an Amazon EC2 Auto Scaling group to launch custom-AMI EC2 instances running the Docker containers to process the data.
C. Use AWS DataSync to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Lambda function that starts an AWS Step Functions workflow. Store the Docker images in Amazon Elastic Container Registry (Amazon ECR) and trigger AWS Batch to run the container and process the sequencing data.
D. Use an AWS Storage Gateway file gateway to transfer the sequencing data to Amazon S3. Use S3 events to trigger an AWS Batch job that executes on Amazon EC2 instances running the Docker containers to process the data.
Correct answer: C. AWS DataSync is purpose-built for moving bulk data (about 200 GB per genome) to Amazon S3 over the existing Direct Connect link, so regularly shipping Snowball Edge devices would only add turnaround time. A Lambda function alone cannot process jobs that run for several hours (it has a 15-minute limit), but it can start a Step Functions workflow that hands the work to AWS Batch, which scales compute on demand and runs the team's existing Docker containers from Amazon ECR.
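For reference, the event chain in option C can be sketched as a small Lambda handler that starts one Step Functions execution per uploaded sequencing object. This is a minimal illustration, not part of the exam question: the STATE_MACHINE_ARN environment variable and the shape of the workflow input are assumptions.

```python
import json
import os
import urllib.parse

import boto3

# Hypothetical state machine ARN, supplied via a Lambda environment variable.
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]

sfn = boto3.client("stepfunctions")


def handler(event, context):
    """Triggered by an S3 ObjectCreated event; starts one Step Functions
    execution per uploaded sequencing object."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        # The workflow input carries the object location so the downstream
        # AWS Batch job (container image pulled from Amazon ECR) knows
        # which genome to process.
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
    return {"statusCode": 200}
```

Inside the state machine, a task state using the arn:aws:states:::batch:submitJob.sync service integration would submit the Batch job and wait for the multi-hour processing run to finish before the workflow records the result in S3.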