Easy setup of a fast HPCC System on AWS

Questions or comments related to Cloud Computing and the HPCC Systems Instant Cloud for AWS

Thu Jun 18, 2015 4:10 pm

You can use the HPCC CloudFormation (CF) template and accompanying scripts to configure and deploy an HPCC System on AWS from your Windows computer in two steps.
    1. Copy the 14 accompanying scripts and your ssh pem file to an S3 bucket.
    2. Use CloudFormation on the AWS console to do the rest.
The following GitHub repository, https://github.com/tlhumphrey2/EasyFastHPCCoAWS, has 1) the HPCC CF template, 2) the accompanying scripts, and 3) the document EasyFastHPCCOnAWS.pdf, which provides details on the deployment process.
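
If you would rather script both steps than click through the AWS console, here is a minimal boto3 sketch of the same two steps. All bucket, file, stack, and template names below are placeholders, not the repository's actual names:

    import boto3

    # Placeholder names; substitute your own bucket and the template
    # and scripts from the repository.
    BUCKET = "my-hpcc-deploy-bucket"
    FILES = ["MyKeyPair.pem", "cp2S3FromMasterAndAllSlaves.pl"]  # plus the other scripts

    # Step 1: copy the accompanying scripts and your ssh pem file to an S3 bucket.
    s3 = boto3.client("s3")
    for name in FILES:
        s3.upload_file(name, BUCKET, name)

    # Step 2: create the stack from the HPCC CF template. The AWS console
    # works just as well; CAPABILITY_IAM is needed because the template
    # creates IAM resources.
    cf = boto3.client("cloudformation")
    cf.create_stack(
        StackName="MyHPCCStack",
        TemplateURL="https://%s.s3.amazonaws.com/MyHPCCCloudFormation.template" % BUCKET,
        Capabilities=["CAPABILITY_IAM"],
    )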

tlhumphrey2
 

Tue Jun 30, 2015 1:56 pm

Added two new capabilities to EasyFastHPCCoAWS:
    1. htpasswd authentication.
    2. The ability to select the HPCC Platform to deploy.
https://github.com/tlhumphrey2/EasyFastHPCCoAWS
tlhumphrey2
 

Thu Jul 02, 2015 3:16 pm

On https://github.com/tlhumphrey2/EasyFastHPCCoAWS, I updated the document EasyFastHPCCOnAWS.pdf to include a new appendix, Appendix D, which gives your AWS administrator detailed instructions for adding a new IAM group, Super-Power-Group, and adding you to that group.

If you are an AWS administrator, you don't need to follow the instructions of Appendix D in order to use CloudFormation to build a stack that launches an HPCC System on AWS. If you are another IAM user, you only need to have your AWS administrator follow the instructions of Appendix D if, while creating a stack, you get an error message containing the words “not authorized to perform: iam:CreateRole”.
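
For reference, the gist of what Appendix D has your administrator do could be scripted with boto3 like this. The group name comes from the document; the policy shown is a broad illustrative stand-in (the document's actual policy may differ) and the user name is a placeholder:

    import boto3, json

    iam = boto3.client("iam")

    # Create the Super-Power-Group described in Appendix D.
    iam.create_group(GroupName="Super-Power-Group")

    # Grant the permissions CloudFormation needs, including iam:CreateRole.
    policy = {
        "Version": "2012-10-17",
        "Statement": [{"Effect": "Allow",
                       "Action": ["iam:CreateRole", "iam:PutRolePolicy",
                                  "iam:PassRole"],
                       "Resource": "*"}],
    }
    iam.put_group_policy(GroupName="Super-Power-Group",
                         PolicyName="AllowCreateRole",
                         PolicyDocument=json.dumps(policy))

    # Add the IAM user who will create the stack to the group.
    iam.add_user_to_group(GroupName="Super-Power-Group",
                          UserName="your-iam-user")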
tlhumphrey2
 

Wed Aug 05, 2015 1:54 pm

I have added a new capability to the GitHub repository EasyFastHPCCoAWS, https://github.com/tlhumphrey2/EasyFastHPCCoAWS. This capability saves data from a THOR to S3 buckets and restores it back. The transfer of data between the S3 buckets and each THOR slave is done in parallel, which means Big Data can be transferred quickly.

There are two functions one uses:
    cp2S3FromMasterAndAllSlaves.pl (used to transfer data from THOR nodes to S3 buckets).
    cpFromS3ToMasterAndAllSlaves.pl (used to transfer data from S3 buckets to THOR nodes).
To use the 2nd function to transfer data from S3 buckets to THOR nodes, the receiving THOR must be compatible with the THOR from which the data originally came, i.e. the THOR that cp2S3FromMasterAndAllSlaves.pl copied the data from. Compatible means: 1) both THORs have the same number of instances, 2) both THORs have the same number of slave nodes per instance, and 3) the disk space of the receiving THOR is large enough to hold the data.

I haven’t yet written a document giving details on how to use these new functions, but the repository's README.md has a brief explanation of their use.
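
Until then, here is an illustrative Python sketch of the parallel-transfer idea behind cp2S3FromMasterAndAllSlaves.pl. This is not the repository's Perl code; the bucket name is a placeholder, and the data directory and part-file glob pattern are assumptions:

    import boto3
    from concurrent.futures import ThreadPoolExecutor
    from pathlib import Path

    BUCKET = "my-thor-backup-bucket"                        # placeholder
    DATA_DIR = Path("/var/lib/HPCCSystems/hpcc-data/thor")  # typical Thor data directory

    s3 = boto3.client("s3")

    def upload_part(path: Path) -> None:
        # Each part file is uploaded independently, so many transfers
        # run at once instead of one after another.
        s3.upload_file(str(path), BUCKET, path.name)

    # The repository's script fans this out across every Thor node at once;
    # here a thread pool on a single node stands in for that fan-out.
    with ThreadPoolExecutor(max_workers=8) as pool:
        list(pool.map(upload_part, DATA_DIR.glob("*._*_of_*")))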
tlhumphrey2
 

