Q153. A big company has a service that processes gigantic clickstream data sets, which are often the result of holiday shopping traffic on a retail website or sudden dramatic growth on the data network of a media or social networking site. It is becoming more and more expensive for the company to analyze these clickstream data sets on its on-premises infrastructure, and as the data sets keep growing, fewer of the applications are able to provide a timely response. The service currently uses a Hadoop cluster with Cascading. What is the best way to migrate the applications to AWS?

A. Put the source data into S3 and migrate the processing service to an AWS EMR Hadoop cluster with Cascading. Enable EMR to directly read and query data from the S3 buckets. Write the output to an RDS database.
B. Put the source data into a Kinesis stream and migrate the processing service to AWS Lambda to utilize its scaling feature. Enable Lambda to directly read and query data from the Kinesis stream. Write the output to an RDS database.
C. Put the source data into an S3 bucket and migrate the processing service to AWS EC2 with Auto Scaling. Ensure that the Auto Scaling configuration has proper maximum and minimum numbers of instances. Monitor the performance in a CloudWatch dashboard. Write the output to a DynamoDB table for downstream processing.
D. Put the source data into a Kinesis stream and migrate the processing service to an AWS EMR cluster with Cascading. Enable EMR to directly read and query data from the Kinesis stream. Write the output to Redshift.
Correct answer: D
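Option D is the intended answer: Kinesis absorbs the bursty clickstream ingestion, EMR lets the existing Cascading jobs run with minimal rework, and Redshift serves the downstream analytical queries. The Python/boto3 sketch below illustrates the general shape of that pipeline; it is only a sketch of the architecture, and the stream name, instance types, jar location, and Redshift JDBC URL are hypothetical placeholders rather than values given in the question.

# Minimal sketch of option D, assuming boto3 is installed and configured
# with suitable AWS credentials. All resource names below are hypothetical.
import json
import boto3

kinesis = boto3.client("kinesis")
emr = boto3.client("emr")

# 1. Producers push raw clickstream events into a Kinesis stream.
def put_click_event(event: dict) -> None:
    kinesis.put_record(
        StreamName="clickstream-events",          # hypothetical stream name
        Data=json.dumps(event).encode("utf-8"),
        PartitionKey=event.get("session_id", "unknown"),
    )

# 2. An EMR cluster runs the existing Cascading job as a custom JAR step.
#    The job reads from the Kinesis stream and writes results to Redshift.
def launch_processing_cluster() -> str:
    response = emr.run_job_flow(
        Name="clickstream-cascading",
        ReleaseLabel="emr-6.15.0",                # hypothetical EMR release
        Applications=[{"Name": "Hadoop"}],
        Instances={
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge",
                 "InstanceCount": 1},
                {"InstanceRole": "CORE", "InstanceType": "m5.xlarge",
                 "InstanceCount": 4},
            ],
            "KeepJobFlowAliveWhenNoSteps": False,
        },
        Steps=[{
            "Name": "cascading-clickstream-job",
            "ActionOnFailure": "TERMINATE_CLUSTER",
            "HadoopJarStep": {
                # Hypothetical jar and arguments for the existing Cascading job.
                "Jar": "s3://my-bucket/jobs/clickstream-cascading.jar",
                "Args": [
                    "--kinesis-stream", "clickstream-events",
                    "--redshift-jdbc", "jdbc:redshift://example:5439/analytics",
                ],
            },
        }],
        JobFlowRole="EMR_EC2_DefaultRole",
        ServiceRole="EMR_DefaultRole",
    )
    return response["JobFlowId"]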