AWS S3 with Spring Boot: Uploading and Downloading Files to Buckets

If you are reading this article, chances are you are a developer already familiar with Spring Boot or AWS S3, so feel free to skip the introduction and jump straight to the coding part (the brief intro below is mostly for SEO). Spring Boot has become my go-to framework and the preferred choice of many developers; coming from a PHP background it was very easy for me to pick up, and there is no looking back towards `PHP`. Spring Boot is a framework built on top of the Spring Framework that uses all of the Spring components. With Spring Boot and its starter packages we can start a project with nearly zero configuration, eliminating the dirty `xml`, and use it to write microservices, web applications, command-line applications, schedulers and so on.

As applications move towards the cloud, we are rapidly adopting PaaS (Platform as a Service). One such service provided by Amazon Web Services is S3 (Simple Storage Service). It offers a set of APIs to manage files: upload them, delete them, restrict access or make them public, while AWS provides the infrastructure and handles replication and load distribution to guarantee maximum uptime, for a certain cost in "$".

Now we shall write simple APIs to upload and download files from S3 buckets.

Required Dependencies

Now we shall add the required dependencies to our Spring Boot project. The main dependencies are `aws-java-sdk`, `spring-boot-starter-web`, `commons-io` and `commons-fileupload`, so that we can upload files to AWS S3.

A related post uses the updated S3 client with a different framework (Micronaut), but the code can easily be adapted to Spring Boot:

AWS S3 & Micronaut: File upload and download using Java
<dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <!-- https://mvnrepository.com/artifact/com.amazonaws/aws-java-sdk -->
        <dependency>
            <groupId>com.amazonaws</groupId>
            <artifactId>aws-java-sdk</artifactId>
            <version>1.11.560</version>
        </dependency>
        <dependency>
            <groupId>io.springfox</groupId>
            <artifactId>springfox-swagger2</artifactId>
            <version>2.8.0</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>io.springfox</groupId>
            <artifactId>springfox-swagger-ui</artifactId>
            <version>2.8.0</version>
            <scope>compile</scope>
        </dependency>
        <dependency>
            <groupId>commons-fileupload</groupId>
            <artifactId>commons-fileupload</artifactId>
            <version>1.3.1</version>
        </dependency>

        <!-- https://mvnrepository.com/artifact/org.apache.commons/commons-io -->
        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-io</artifactId>
            <version>1.3.2</version>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>
    </dependencies>

Creating a Connection to an AWS S3 Bucket

There are a few prerequisites: an AWS access key, an AWS secret key, and an empty bucket created with write access for that access key. With those in place, we can create a configuration class that provides an AmazonS3Client instance.

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.services.s3.AmazonS3Client;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class S3Config {
    @Value("${s3.access.name}")
    String accessKey;
    @Value("${s3.access.secret}")
    String accessSecret;

    // Builds the S3 client from the credentials configured in application.properties
    @Bean
    public AmazonS3Client generateS3Client() {
        AWSCredentials credentials = new BasicAWSCredentials(accessKey, accessSecret);
        return new AmazonS3Client(credentials);
    }
}

Then update `application.properties` with the keys used in the `@Value` annotations.
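
For reference, here is a minimal `application.properties` sketch matching those annotations; the credential, bucket and folder values are placeholders that you must replace with your own:

s3.access.name=YOUR_AWS_ACCESS_KEY
s3.access.secret=YOUR_AWS_SECRET_KEY
s3.bucket.name=my-demo-bucket
s3.default.folder=uploads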

The Class Which Handles Uploading and Downloading Files from AWS S3

I have named this class S3Factory. A few qualified design experts may have concerns about the name, but since it acts as a proxy to the AWS S3 client, a proxy factory can also be called a factory :-P. Now we shall begin with the code.

import com.amazonaws.services.s3.AmazonS3Client;
import com.amazonaws.services.s3.model.Bucket;
import com.amazonaws.services.s3.model.S3Object;
import com.amazonaws.services.s3.model.S3ObjectInputStream;
import org.apache.commons.io.IOUtils;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;

import java.io.File;
import java.io.FileOutputStream;
import java.io.IOException;
import java.util.List;

@Service
public class S3Factory {

    @Autowired
    AmazonS3Client amazonS3Client;

    @Value("${s3.bucket.name}")
    String defaultBucketName;

    @Value("${s3.default.folder}")
    String defaultBaseFolder;

    public List<Bucket> getAllBuckets() {
        return amazonS3Client.listBuckets();
    }


    public void uploadFile(File uploadFile) {
        amazonS3Client.putObject(defaultBucketName, uploadFile.getName(), uploadFile);
    }

    // Writes the byte content to a temporary file, then uploads it under the default folder
    public void uploadFile(String name, byte[] content) {
        File file = new File("/tmp/" + name);
        try (FileOutputStream fos = new FileOutputStream(file)) {
            fos.write(content);
        } catch (IOException e) {
            e.printStackTrace();
            return;
        }
        amazonS3Client.putObject(defaultBucketName, defaultBaseFolder + "/" + file.getName(), file);
    }

    // Downloads the object and returns its content as a byte array
    public byte[] getFile(String key) {
        try (S3Object obj = amazonS3Client.getObject(defaultBucketName, defaultBaseFolder + "/" + key);
             S3ObjectInputStream stream = obj.getObjectContent()) {
            return IOUtils.toByteArray(stream);
        } catch (IOException e) {
            e.printStackTrace();
        }
        return null;
    }

}

Writing the REST API to upload and download files to and from AWS S3 buckets.

import com.amazonaws.services.s3.model.Bucket;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.core.io.ByteArrayResource;
import org.springframework.http.MediaType;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;
import org.springframework.web.multipart.MultipartFile;

import java.io.IOException;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

@RestController
public class S3StorageController {
    @Autowired
    S3Factory s3Factory;

    @GetMapping(path = "/buckets")
    public List<Bucket> listBuckets(){
        return s3Factory.getAllBuckets();
    }

    @PostMapping(path = "/upload",consumes = {MediaType.MULTIPART_FORM_DATA_VALUE})
    public Map<String, String> uploadFile(@RequestPart(value = "file", required = false) MultipartFile file) throws IOException {
        s3Factory.uploadFile(file.getOriginalFilename(), file.getBytes());
        Map<String, String> result = new HashMap<>();
        result.put("key", file.getOriginalFilename());
        return result;
    }

    @GetMapping(path = "/download")
    public ResponseEntity<ByteArrayResource> downloadFile(@RequestParam(value = "file") String file) throws IOException {
        byte[] data = s3Factory.getFile(file);
        ByteArrayResource resource = new ByteArrayResource(data);

        return ResponseEntity
                .ok()
                .contentLength(data.length)
                .header("Content-type", "application/octet-stream")
                .header("Content-disposition", "attachment; filename=\"" + file + "\"")
                .body(resource);

    }
}

Now open your API testing tool and call the APIs to check whether the code is working :-P. There are basically three APIs: list the buckets, upload a file and download a file, as shown in the example calls below.
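
For example, assuming the application is running locally on the default port 8080 and `test.pdf` is any file on your machine (both are just example values), the three endpoints can be exercised with curl like this:

curl http://localhost:8080/buckets

curl -F "file=@test.pdf" http://localhost:8080/upload

curl -o downloaded.pdf "http://localhost:8080/download?file=test.pdf"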

And here is the snippet for the main application class.

@SpringBootApplication
@EnableSwagger2
public class S3demoApplication {

    public static void main(String[] args) {
        SpringApplication.run(S3demoApplication.class, args);
    }

}
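
Since `springfox-swagger2` and `springfox-swagger-ui` are already on the classpath and `@EnableSwagger2` is present, you can optionally add a `Docket` bean to control which endpoints appear in the Swagger UI. The sketch below is my own addition, not part of the original setup, and the base package name is a placeholder:

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import springfox.documentation.builders.PathSelectors;
import springfox.documentation.builders.RequestHandlerSelectors;
import springfox.documentation.spi.DocumentationType;
import springfox.documentation.spring.web.plugins.Docket;

@Configuration
public class SwaggerConfig {

    // Restricts the generated documentation to the controllers of this demo application
    @Bean
    public Docket s3Api() {
        return new Docket(DocumentationType.SWAGGER_2)
                .select()
                .apis(RequestHandlerSelectors.basePackage("com.example.s3demo")) // placeholder package
                .paths(PathSelectors.any())
                .build();
    }
}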

Related post: Spring Boot: Uploading and downloading a file from GCP Storage (Google Cloud Storage)