Q153. A large company runs a service that processes gigantic clickstream data sets, often the result of holiday shopping traffic on a retail website or of sudden dramatic growth on the data network of a media or social networking site. Analyzing these clickstream data sets on its on-premises infrastructure is becoming more and more expensive, and as the data sets keep growing, fewer and fewer applications can deliver timely results. The service currently runs on a Hadoop cluster with Cascading. What is the best way to migrate the application to AWS?
A. Put the source data in S3 and migrate the processing application to an AWS EMR Hadoop cluster with Cascading. Configure EMR to read and query data directly from the S3 buckets. Write the output to an RDS database.

B. Put the source data into a Kinesis stream and migrate the processing application to AWS Lambda to take advantage of its scaling. Configure Lambda to read and query data directly from the Kinesis stream. Write the output to an RDS database.

C. Put the source data in an S3 bucket and migrate the processing application to AWS EC2 with Auto Scaling. Ensure that the Auto Scaling configuration has appropriate maximum and minimum instance counts. Monitor performance in a CloudWatch dashboard. Write the output to a DynamoDB table for downstream processing.

D. Put the source data into a Kinesis stream and migrate the processing application to an AWS EMR cluster with Cascading. Configure EMR to read and query data directly from the Kinesis stream. Write the output to Redshift.

Correct answer: D
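To make the correct option concrete, here is a minimal sketch of the EMR job-flow parameters that option D implies: an EMR cluster running a Cascading job that reads from a Kinesis stream and writes to Redshift. All names (stream, bucket, JAR path, JDBC URL, instance types) are hypothetical, and the sketch only assembles the request dictionary that an EMR launch call (e.g. boto3's `run_job_flow`) would accept; it does not contact AWS.

```python
def build_emr_job_flow(stream_name: str, redshift_jdbc_url: str) -> dict:
    """Assemble EMR job-flow parameters for a Cascading clickstream job
    that reads from a Kinesis stream and writes to Redshift.

    All resource names below are hypothetical placeholders.
    """
    return {
        "Name": "clickstream-cascading",       # hypothetical cluster name
        "ReleaseLabel": "emr-5.36.0",          # any Hadoop-capable EMR release
        "Instances": {
            "InstanceGroups": [
                {"InstanceRole": "MASTER", "InstanceType": "m5.xlarge",
                 "InstanceCount": 1},
                {"InstanceRole": "CORE", "InstanceType": "m5.xlarge",
                 "InstanceCount": 4},
            ],
        },
        "Steps": [
            {
                "Name": "cascading-clickstream-job",
                "ActionOnFailure": "CONTINUE",
                "HadoopJarStep": {
                    # The Cascading fat JAR is assumed to be staged in S3.
                    "Jar": "s3://my-bucket/jobs/clickstream-cascading.jar",
                    # The job itself is assumed to parse these arguments.
                    "Args": ["--input-stream", stream_name,
                             "--output-jdbc", redshift_jdbc_url],
                },
            }
        ],
    }


# Build (but do not submit) the request for a hypothetical stream and cluster.
params = build_emr_job_flow("clickstream",
                            "jdbc:redshift://example:5439/analytics")
print(params["Steps"][0]["Name"])
```

In a real migration, this dictionary would be passed to `emr_client.run_job_flow(**params)` with credentials configured; the point of the sketch is that the Cascading job moves to EMR unchanged while Kinesis replaces the on-premises ingestion path.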