How to mock Amazon S3 in an integration test
I'm trying to get a "walking skeleton" of my app up that will use S3 for persistence. I would like to use a fake S3 service so that each developer's desktop can read and write at will.
I thought mocks3 would be perfect, as I could spin up a Jetty server in my JUnit tests. The problem is that mocks3 doesn't allow any writes, not even for setup, as far as I can tell.
So how do others do this?
There is also the Findify s3mock tool, written exactly for this purpose. It mocks the essential parts of the AWS S3 API on top of the local filesystem:
import com.amazonaws.auth.AnonymousAWSCredentials;
import com.amazonaws.services.s3.AmazonS3Client;
import io.findify.s3mock.S3Mock;

// start the mock S3 API on port 8001, backed by the local filesystem
S3Mock api = S3Mock.create(8001, "/tmp/s3");
api.start();
AmazonS3Client client = new AmazonS3Client(new AnonymousAWSCredentials());
// use the local API mock, not the AWS one
client.setEndpoint("http://127.0.0.1:8001");
client.createBucket("testbucket");
client.putObject("testbucket", "file/name", "contents");
It's also easily embeddable and requires no configuration.
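As a rough sketch of how this might be wired into a test class (assuming JUnit 4; the port, backing directory, and bucket name are just the ones from the snippet above):

import com.amazonaws.auth.AnonymousAWSCredentials;
import com.amazonaws.services.s3.AmazonS3Client;
import io.findify.s3mock.S3Mock;
import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

public class S3PersistenceIT {
    private S3Mock api;
    private AmazonS3Client client;

    @Before
    public void startMockS3() {
        // file-backed mock; in-memory variants are also available in the library
        api = S3Mock.create(8001, "/tmp/s3");
        api.start();
        client = new AmazonS3Client(new AnonymousAWSCredentials());
        client.setEndpoint("http://127.0.0.1:8001");
    }

    @After
    public void stopMockS3() {
        // tear down the mock's underlying server between tests
        api.shutdown();
    }

    @Test
    public void roundTripsAnObject() {
        client.createBucket("testbucket");
        client.putObject("testbucket", "file/name", "contents");
        assertEquals("contents", client.getObjectAsString("testbucket", "file/name"));
    }
}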
If you're OK with depending on a running Docker container and want something well supported, you could consider using LocalStack.
Before running your tests, start S3 like so:
docker run --name localstack -d -p 5000:5000 -e SERVICES=s3:5000 localstack/localstack
And then stop it when tests complete like so:
docker stop localstack
You'll need to configure your S3 client to point to localhost:5000 for tests. In Java, this can be done like so:
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;

AmazonS3 s3 = AmazonS3ClientBuilder.standard()
    .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
        "http://localhost:5000",
        "us-west-2"))
    .build();
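Depending on how LocalStack is run, you may also want path-style access (so bucket names aren't resolved as DNS subdomains of localhost) and dummy credentials. Here's a hedged variant of the same builder call; the flag and credential classes are standard AWS SDK v1, but whether you need them depends on your setup:

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;

AmazonS3 s3 = AmazonS3ClientBuilder.standard()
    .withEndpointConfiguration(new AwsClientBuilder.EndpointConfiguration(
        "http://localhost:5000", "us-west-2"))
    // LocalStack accepts any credentials; these values are placeholders
    .withCredentials(new AWSStaticCredentialsProvider(new BasicAWSCredentials("test", "test")))
    // resolve buckets as path segments rather than subdomains of localhost
    .withPathStyleAccessEnabled(true)
    .build();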
Another option is S3 ninja, which emulates the S3 API for development and testing purposes.
Have a look at Adobe's S3Mock. This S3 mock server can be started via a Docker container or via its JUnit 4 rule / JUnit 5 extension.
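If you go the Docker route, the invocation is roughly the following (the adobe/s3mock image has historically exposed HTTP on 9090 and HTTPS on 9191; check the project's README for the current ports and tags):

docker run -p 9090:9090 -p 9191:9191 adobe/s3mock

Your test's S3 client would then point at http://localhost:9090, using the same endpoint-override approach shown above.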
Tornado, a Python web framework, has an example app that is just what you're looking for:
https://github.com/facebook/tornado/blob/master/demos/s3server/s3server.py
It can be used out of the box.
You can use Scality's s3server; it can run on your machine either using Node.js or via Docker, and it gives you a local S3 service instance. It's open source under a BSD license: github.com/scality/s3
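For example, the Docker route has looked roughly like this (image name, port, and the default accessKey1/verySecretKey1 credentials are taken from the project's documentation and may have changed since, so treat this as a sketch):

docker run -d --name s3server -p 8000:8000 scality/s3server

Then point your S3 client at http://localhost:8000 with those default credentials, just as in the endpoint-override examples above.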
LocalS3 is a good choice.
It's based on Netty and has no other heavy dependencies, so starting a LocalS3 service or Docker container is very quick.
It supports commonly used Amazon S3 APIs, including versioned objects, multipart uploads, etc.
It supports both in-memory and persistence modes.
One option is to scrap the Jetty server and use Apache VFS with its S3 plugin. With that, you can use the memory- or file-based storage implementations for integration testing.
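As a minimal sketch of that idea, here is plain Apache Commons VFS writing through its in-memory ram:// filesystem; in production the same code would resolve an s3:// URI via the VFS S3 plugin, so only the URI scheme and the provider dependency change:

import java.io.OutputStream;
import java.nio.charset.StandardCharsets;
import org.apache.commons.vfs2.FileObject;
import org.apache.commons.vfs2.FileSystemManager;
import org.apache.commons.vfs2.VFS;

public class VfsRamExample {
    public static void main(String[] args) throws Exception {
        FileSystemManager fsManager = VFS.getManager();

        // in tests, resolve against the in-memory filesystem;
        // production code would resolve an s3:// URI through the S3 provider instead
        FileObject file = fsManager.resolveFile("ram://testbucket/file/name");
        file.createFile();
        try (OutputStream out = file.getContent().getOutputStream()) {
            out.write("contents".getBytes(StandardCharsets.UTF_8));
        }
        System.out.println("stored " + file.getContent().getSize() + " bytes");
    }
}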