SnatchRabbit
Feb 23, 2006

by sebmojo
Quick question here. I'm trying to use EBS to set up a Moodle platform. I've set up the environment and uploaded the Moodle zip package, which deploys correctly. I run through the web installation and connect to the db, but I get stuck at the pre-requisite checks with the following error:

code:
The Zip PHP extension is now required by Moodle, info-ZIP binaries or PclZip library are not used anymore.
I'm not quite certain how to 1) log into the EC2 instance that the EBS environment created and 2) install the PHP extension once I'm in there. I've looked through all the Moodle docs, but they all have you SSH into an EC2 instance. I also thought that maybe the EBS config setting - Zlib output compression: On/True - would do the trick, but that hasn't worked. Anything else I could try?
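
For anyone who finds this later: you can SSH into the instance the environment created with the EB CLI's eb ssh, but anything installed by hand is lost when the environment rebuilds, so the supported route is an .ebextensions config file bundled into the zip. A minimal sketch; the filename and yum package name are assumptions and vary by platform/PHP version:

code:
# .ebextensions/01-php-zip.config  (hypothetical filename)
packages:
  yum:
    php-zip: []   # assumed package name; check your platform's PHP version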


SnatchRabbit
Feb 23, 2006

by sebmojo

xpander posted:

I'm not sure where EBS comes in here - were you given a disk image to use somehow? EBS is Amazon's "hard drive in the cloud" offering, so it shouldn't have much to do with Moodle. But I don't know Moodle at all.

Sorry, I'm referring to Elastic Beanstalk. Essentially, Moodle is just a php application that I can download in zip form and elastic beanstalk will accept it. The trouble is getting the environment I set up to play nice with moodle vis a vis php extensions.

SnatchRabbit
Feb 23, 2006

by sebmojo
I have an older EC2 instance from around 2014, I think in EC2 Classic. I built a complete upgraded server, but I can't seem to associate my elastic IP with the new server. I'm sure there's something I'm missing regarding EC2 Classic, but does anyone have a quick tutorial on how to associate an existing elastic IP with a new server? I tried associating it in AWS, but it only allows me to associate with the old server.

edit: figured it out, had to migrate the elastic ip to the VPC scope of the new instance.
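
For anyone who hits the same wall, the migration is a single call in boto3 (region and address below are placeholders; the EIP has to be disassociated first):

code:
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

# Move an EC2-Classic Elastic IP into the VPC scope so it can be
# associated with a VPC instance.
resp = ec2.move_address_to_vpc(PublicIp="203.0.113.10")  # placeholder EIP
print(resp["AllocationId"], resp["Status"])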

SnatchRabbit fucked around with this message at 20:56 on Aug 4, 2017

SnatchRabbit
Feb 23, 2006

by sebmojo
Speaking of certs, is there a good resource for free practice exams? I just finished a course for associate solutions architect, so I'm looking for some more exams to make sure I'm on my A game.

SnatchRabbit
Feb 23, 2006

by sebmojo

Rapner posted:

Not free, but A Cloud Guru is very cheap.

Do they sell the practice exams separately? All I see are the $99 courses.

SnatchRabbit
Feb 23, 2006

by sebmojo

Seventh Arrow posted:

I booked my Solutions Architect - Associate exam for Feb 12 so I'm going to try and do as many labs and practice exams as I can until then. I've heard that there are a lot of scenario questions, so it seems best to have a well-rounded knowledge of the material instead of just mastering AWS trivia questions. Looking at the A Cloud Guru forums, however, it seems that the exams take a keen interest in subjects that one would never think to focus on initially - like bastion hosts, SWF use cases, and so on.

I'm also studying data engineering at a local place and the teacher who runs it says he has employers requesting AWS-certified people all the time - to the degree that he's thinking of starting an AWS course just to fill the demand. But I wonder if SAA alone will help me get my foot in the door. I mean, it's kind of the "paper trainee hat" level of AWS certs.

I scraped by on my AWS Solutions Architect exam. I took courses and practice exams. My experience is that I got way more questions about newer stuff like Lambda than I was led to believe would be there. They seem to favor newer technologies even though the courses say to focus on core services like EC2, S3, EBS, etc. So definitely don't skimp on the obscure stuff.

SnatchRabbit
Feb 23, 2006

by sebmojo
Has anyone messed around with Lambda functions? My boss asked me to come up with a function to do some penetration testing on our instances: essentially, checking whether a certain port on a certain instance is open. I'm thinking about using API Gateway and maybe a simple webpage front end that will run the lambda function, but I'm open to ideas. I kind of suck at coding in python, but this should be pretty simple. Anyone have any suggestions?
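
A minimal sketch of the socket check (the event shape is an assumption, and the function needs to run inside the VPC to reach private instances):

code:
import json
import socket

def lambda_handler(event, context):
    # Expects something like {"host": "10.0.0.5", "port": 22} from
    # API Gateway (hypothetical event shape).
    host = event["host"]
    port = int(event["port"])
    sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    sock.settimeout(3)  # don't hang the whole Lambda on a filtered port
    try:
        is_open = sock.connect_ex((host, port)) == 0
    finally:
        sock.close()
    return {"statusCode": 200,
            "body": json.dumps({"host": host, "port": port, "open": is_open})}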

SnatchRabbit
Feb 23, 2006

by sebmojo

JHVH-1 posted:

Sounds pretty complicated; you could just run nc or something. I wrote a script like this in Python a couple of jobs ago to make sure a port was open (so the app was running), just using sockets I think.

Also a heads up, technically if you are doing penetration testing you are supposed to notify them https://aws.amazon.com/security/penetration-testing/
If you have your rules set up properly and only allow what you need, though, you shouldn't need to poll this kind of thing frequently. Based on the network rules you should be able to tell whether something is allowed or not. Just stumbled upon this while googling a second ago: https://nccgroup.github.io/Scout2/
You could also enable flow logs on a VPC: https://aws.amazon.com/blogs/security/how-to-optimize-and-visualize-your-security-groups/

Thanks, those are very useful links. Re: Lambda, I wasn't sure how feasible it was; it was more like an idea for us to dip our toes into serverless. Yeah, I'll bring up that we are supposed to notify AWS. The problem with flow logs is that this is a build/test environment, so I don't think there's much traffic flowing through it yet.

SnatchRabbit
Feb 23, 2006

by sebmojo
Thanks for the replies. Another question: can anyone recommend a good S3 viewer for Mac OS?

SnatchRabbit
Feb 23, 2006

by sebmojo
Thanks again, yet another question: is there a tool or command I can use to get a list of all the AMIs that are currently being used with EC2 in my environment? Essentially, we want to be able to prove, for regulatory reasons, that we are only using base AMIs with all services disabled by default.
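
A quick boto3 sketch of the inventory half (one region at a time); for continuous enforcement, AWS Config's managed approved-amis-by-id rule may also be worth a look:

code:
import boto3

ec2 = boto3.client("ec2")

# Gather the distinct AMI IDs behind every instance in the region.
image_ids = set()
for page in ec2.get_paginator("describe_instances").paginate():
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            image_ids.add(instance["ImageId"])

for image_id in sorted(image_ids):
    print(image_id)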

SnatchRabbit
Feb 23, 2006

by sebmojo
Has anyone used Lambda to parse SNS events? I've been trying to parse an AWS Config rule's SNS events and use the data to do various things. I'm able to parse the data up to a point, but I can't seem to get at the message's JSON data. I'm using Python 2.7 with the json and boto3 imports. I think my issue is that the JSON is being read as a single key. Anyone know how to do this?

code:
from __future__ import print_function
import boto3
import json

def lambda_handler(event, context):
    # The SNS 'Message' field arrives as a JSON-encoded *string*, so it
    # reads as one big value until it's parsed into a dict.
    message = event['Records'][0]['Sns']['Message']
    message = json.loads(message)
    print(message)


output:
code:
{u'configRuleNames': [u'restricted-sshv2'], u'awsRegion': u'us-west-2', u'messageType': u'ConfigRulesEvaluationStarted', u'recordVersion': u'1.0', u'notificationCreationTime': u'2018-03-29T22:08:34.631Z', u'awsAccountId': u'########'}
event data: https://dumptext.com/jyTfBcNl
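
For reference, once the Message string has been through json.loads, the fields in the output above come out as ordinary dict keys:

code:
rule = message['configRuleNames'][0]   # 'restricted-sshv2'
region = message['awsRegion']          # 'us-west-2'
msg_type = message['messageType']      # 'ConfigRulesEvaluationStarted'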

SnatchRabbit fucked around with this message at 23:30 on Mar 29, 2018

SnatchRabbit
Feb 23, 2006

by sebmojo
Does anyone know if it's possible in cloudformation to do a GetAtt on a resource that's already been created manually? Like, not something from another stack, just an ARN from, say, an SNS topic you turned on by hand in the console? Yeah, I could make a parameter and have the user enter it at runtime, but what's the fun in that? It doesn't look like this is possible, but figured I would ask.

SnatchRabbit
Feb 23, 2006

by sebmojo
We currently have a DokuWiki site up and running on EC2. It's essentially just a web server for various internal documentation. Some of the pages, however, contain iframe links to documents we have hosted on S3. The S3 policy just has those documents made public. I'd like to make this a bit more secure, but I'm trying to figure out the best solution. I thought about granting the instance a role with S3 privileges, but the iframe links for the documents would technically be requested by the end user on the web, so giving the EC2 instance an S3 role wouldn't solve that problem, would it? I've also thought about having the docs stored locally on the instance and having it do an s3 sync periodically, but that seems kind of overwrought. Is there a simpler solution I'm overlooking? edit: would this be something I could set up with CORS configuration or presigned URLs?
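
If presigned URLs turn out to be the route, a minimal sketch with boto3 (bucket and key are placeholders): the wiki would generate these server-side using the instance role and drop them into the iframes, so the bucket itself stays private.

code:
import boto3

s3 = boto3.client("s3")

# Time-limited GET URL for a private object; anyone holding the URL
# can fetch the object until it expires.
url = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "internal-docs", "Key": "reports/design.pdf"},  # placeholders
    ExpiresIn=3600,  # seconds
)
print(url)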

SnatchRabbit fucked around with this message at 21:27 on May 9, 2018

SnatchRabbit
Feb 23, 2006

by sebmojo

So, I thought about using cloudfront, but I don't know if management will go for the added cost given that this is just an internal wiki/documentation site. I was thinking about adding CORS to the S3 bucket and allowing the GET method from the instance's domain. Would that work while keeping the S3 bucket private, or would it invalidate the entire purpose of having the bucket private in the first place?

SnatchRabbit
Feb 23, 2006

by sebmojo
Does anyone know if it is possible to get the output from Systems Manager's Run Command all in a single file? I have a bunch of instances running a script, but the outputs are all separated in S3 according to the commands and the instance IDs. I'd prefer to have all the outputs appended to a single file. Anyone know how?
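
A quick sketch of stitching them together with boto3 (bucket and prefix are placeholders):

code:
import boto3

s3 = boto3.client("s3")
bucket, prefix = "ssm-output-bucket", "run-command/"  # placeholders

# Walk every per-command/per-instance output object under the prefix
# and append them all into one local file.
with open("combined-output.txt", "wb") as out:
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            out.write(("\n--- %s ---\n" % obj["Key"]).encode())
            out.write(s3.get_object(Bucket=bucket, Key=obj["Key"])["Body"].read())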

SnatchRabbit
Feb 23, 2006

by sebmojo
Does anyone have any links, tips, or tricks for managing patches in Systems Manager? We have a bunch of environments running Oracle apps on RHEL, so I'm just throwing out the bat signal for anything people have found that works.

SnatchRabbit
Feb 23, 2006

by sebmojo
I want to use IAM to control my users' access to Session Manager and restrict access to only certain instances in my AWS account. I found the example policies here, which should give me most of what I need. I'd like this to be completely automated, with session manager access expiring after, say, 24 hours. What I'm thinking is using lambda to create the policy and attach it to a user, which is simple enough. The tricky part is going to be detaching/deleting the policy after the expiration period. I don't think I can use a single lambda to do everything since the timeout is like 5 minutes. I guess I could use that same lambda to invoke another lambda, but I feel like that's an overwrought solution. Is there a way to either set a policy with an expiration, or some other way to achieve this that I'm not thinking of?
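
One idea that avoids the second lambda entirely: bake the expiry into the policy itself with a DateLessThan condition on aws:CurrentTime, so the permission dies on its own and cleanup can happen whenever. A sketch with boto3 (user name, policy name, and instance ARN are placeholders):

code:
import json
from datetime import datetime, timedelta

import boto3

iam = boto3.client("iam")
cutoff = (datetime.utcnow() + timedelta(hours=24)).strftime("%Y-%m-%dT%H:%M:%SZ")

# Inline policy allowing ssm:StartSession on one instance, but only
# until the cutoff time; after that the condition fails and the
# access is dead even if the policy is still attached.
policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Action": "ssm:StartSession",
        "Resource": "arn:aws:ec2:*:*:instance/i-0123456789abcdef0",  # placeholder
        "Condition": {"DateLessThan": {"aws:CurrentTime": cutoff}},
    }],
}
iam.put_user_policy(UserName="some-user", PolicyName="ssm-temp-access",
                    PolicyDocument=json.dumps(policy))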

SnatchRabbit
Feb 23, 2006

by sebmojo
Athena sounds like it might be a good fit, but alternatively you could run a managed Postgres database in RDS and query it with, say, Lambda using Python. The timeout on Lambda is five minutes, though, so you might need to break up the operations you're doing. Lambda might be a nice fit because, assuming the queries run in a reasonable timeframe, you could write the results directly to S3 or dynamodb using the boto3 library in Python.
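
A minimal sketch of that Lambda shape, assuming psycopg2 is bundled in the deployment package and the connection details live in environment variables (all names here are placeholders):

code:
import json
import os

import boto3
import psycopg2  # assumes the driver ships in the deployment package

def lambda_handler(event, context):
    conn = psycopg2.connect(
        host=os.environ["DB_HOST"], dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"], password=os.environ["DB_PASS"],
    )
    with conn, conn.cursor() as cur:
        cur.execute("SELECT id, value FROM results LIMIT 1000")  # placeholder query
        rows = cur.fetchall()
    # Write the result set straight to S3 instead of returning it.
    boto3.client("s3").put_object(
        Bucket="query-results-bucket",  # placeholder
        Key="results.json",
        Body=json.dumps(rows, default=str),
    )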

SnatchRabbit fucked around with this message at 00:11 on Nov 2, 2018

SnatchRabbit
Feb 23, 2006

by sebmojo
Does anyone have a recommendation for a good tutorial for AppSync? I'm trying to put together a management tool for our clients to execute simple tasks on their Oracle enterprise environments. The idea is to have a simple webpage that displays the cloudformation stacks associated with a given environment and then buttons to, say, reboot all the EC2 instances for that environment, refresh a database, launch a new environment, etc. I realize all this can be done in the AWS console, but we'd like to simplify it for the client. At first I was messing around with CSS, api gateway and lambda functions, but I kind of suck at javascript programming, so I was peeking around at AppSync. Wondering if there's a good overview video and/or tutorial to see if it will do what I need with the least amount of friction.

SnatchRabbit
Feb 23, 2006

by sebmojo
I'm writing a management web page for some aws resources and it's pretty daunting. I'm basically having to rewrite portions of the AWS console so that clients can mash buttons to interact with the environments we've built. What I'm currently doing is sending commands and pulling data using api gateway and lambda, then displaying it on the webpage. It's a ton of work to write all the buttons just to get a stripped-down version of Amazon's web GUI, so I'm wondering if I'm going about this all wrong. Is there a simple way to use, say, cloudwatch dashboards or something and pipe that over to another webpage somehow? I know you can make widgets to check on EC2 stats and such, but it seems like you can only pull data out. Anyone tried something like this?

edit: to be clear, manipulating the aws resources isn't the hard part. It's getting status information back that's really proving to be a pain. I'm having to do multiple describe_instances and describe_instance_status calls and loop through everything to get information about the status of whatever it is I executed with the buttons (see the sketch below).

edit2: I guess I could try to pull the events from the stacks in cloudformation as well, but that might be as much of a pain. We'll also be doing a lot of orchestration through codedeploy, so I might be able to get something out of there...
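
For reference, the aggregation looks roughly like this with boto3 (a sketch):

code:
import boto3

ec2 = boto3.client("ec2")

# One paginated call instead of one call per instance;
# IncludeAllInstances also returns stopped machines.
statuses = {}
for page in ec2.get_paginator("describe_instance_status").paginate(IncludeAllInstances=True):
    for s in page["InstanceStatuses"]:
        statuses[s["InstanceId"]] = (s["InstanceState"]["Name"],
                                     s["InstanceStatus"]["Status"])
print(statuses)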

SnatchRabbit fucked around with this message at 23:06 on Jan 16, 2019

SnatchRabbit
Feb 23, 2006

by sebmojo
I have to put together a cloudformation template for a job interview. Basically they want a simple web app to return the current datetime. I'm thinking of having an html page hosted in S3 with some javascript to hit api gateway, which will then hit a lambda to return the date. Maybe I throw in a Route 53 entry. I want this to be as push-button as possible, but how do I get the html page into the cloudformation? Is there a way to code it in, or reference it from a git repo or something? Would I need to use codecommit/codedeploy?

SnatchRabbit
Feb 23, 2006

by sebmojo

Arzakon posted:

There isn't a great way to get an object into S3 from within CFN. One option would be to use a Lambda Custom Resource to drop the object in the S3 bucket created in the CFN template. You essentially have to create another Lambda function in the template, then create the custom resource, which fires the Lambda function to perform the put-object. If you are trying to look PRODUCTION READY, you need to handle what the custom resource does on UPDATE (replace the file?) and DELETE (delete the file, important for deleting the bucket). The custom resource code is probably more than all your other code, but it's what you do when you want to make AWS API calls that CFN can't do for you. If you can do it in 4096 characters you can put it inline in the CFN template; otherwise you have to stage it in S3.

I'd love it, but I could see someone whining about it being overly complex.

No matter what you do, the first thing I'm looking at when I review your work is that your IAM and S3 bucket policies are tight. Really lock those things down with resource-level controls to show attention to detail.

Thanks, that's pretty much the conclusion I came to myself after a while. Writing the lambda to put the html into S3 wasn't all that bad; I just have to write the custom resource now. I'll have to manage the cleanup a bit and empty the bucket, but I think it'll be pretty slick if I get it working properly.
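
For anyone following along, the handler ends up shaped roughly like this (a sketch; cfnresponse is the helper module AWS provides to inline ZipFile functions, and BucketName is a property passed in from the template):

code:
import boto3
import cfnresponse

HTML = "<html><body><p>placeholder index page</p></body></html>"

def handler(event, context):
    s3 = boto3.client("s3")
    bucket = event["ResourceProperties"]["BucketName"]
    try:
        if event["RequestType"] in ("Create", "Update"):
            s3.put_object(Bucket=bucket, Key="index.html", Body=HTML,
                          ContentType="text/html")
        elif event["RequestType"] == "Delete":
            # Empty the object out so the bucket itself can be deleted.
            s3.delete_object(Bucket=bucket, Key="index.html")
        cfnresponse.send(event, context, cfnresponse.SUCCESS, {})
    except Exception:
        cfnresponse.send(event, context, cfnresponse.FAILED, {})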

SnatchRabbit
Feb 23, 2006

by sebmojo

RVWinkle posted:

I'm glad you brought this up because it's something I have been thinking about. I'm hoping that in AWS::IAM::Policy I can just use something like Resource: !Ref S3Bucket.

Yup, that's exactly what I did. I had the bucket set to public read, but I might remove that since I have my lambda using ExtraArgs to set index.html to public when it uploads the file, so I don't think I really need a bucket policy, right?

SnatchRabbit
Feb 23, 2006

by sebmojo
Can someone answer a simple S3 encryption question for me? If I set default bucket encryption to SSE-S3 using AES256, that will encrypt objects in the bucket at rest, correct? Now what about in transit? Currently, I have an off-site QRadar server which I have configured to ingest CloudTrail and GuardDuty logs as log sources. These log sources each have their own IAM user with a policy that allows them to access the S3 bucket. CloudTrail encrypts objects with a KMS key in addition to the default bucket encryption. The CloudTrail QRadar IAM user has access to this KMS key as well as the bucket, and can fetch the logs no problem via access key and secret access key using the Amazon AWS S3 REST API protocol. GuardDuty only has the bucket-level encryption, so its IAM user policy only has access to the encrypted bucket. Now, my question is: will either of these scenarios encrypt the data in transit to the off-site QRadar? In either case, is there a relevant AWS docs page explaining why or why not?

SnatchRabbit
Feb 23, 2006

by sebmojo

JHVH-1 posted:

You would use SSL/HTTPS so it's encrypted in transit. You can enforce it by adding a deny to the policy with the condition "aws:SecureTransport": "false".

https://docs.aws.amazon.com/config/latest/developerguide/s3-bucket-ssl-requests-only.html

Awesome, thanks!
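
For posterity, roughly what that looks like applied with boto3 (bucket name is a placeholder; note that put_bucket_policy overwrites any existing policy, so merge the statement in for real use):

code:
import json

import boto3

policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyInsecureTransport",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": ["arn:aws:s3:::log-bucket",        # placeholder bucket
                     "arn:aws:s3:::log-bucket/*"],
        "Condition": {"Bool": {"aws:SecureTransport": "false"}},
    }],
}
boto3.client("s3").put_bucket_policy(Bucket="log-bucket", Policy=json.dumps(policy))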

SnatchRabbit
Feb 23, 2006

by sebmojo
I think this might be what you are looking for:

https://docs.aws.amazon.com/AmazonCloudWatch/latest/events/EC2_Run_Command.html

quote:

You can use Amazon CloudWatch Events to invoke AWS Systems Manager Run Command and perform actions on Amazon EC2 instances when certain events happen. In this tutorial, set up Run Command to run shell commands and configure each new instance that is launched in an Amazon EC2 Auto Scaling group. This tutorial assumes that you have already assigned a tag to the Amazon EC2 Auto Scaling group, with environment as the key and production as the value.

SnatchRabbit
Feb 23, 2006

by sebmojo
Does anyone have experience with using private API Gateways? I have a client that has their datacenter's public IPs hitting an API Gateway, which they've protected with an AWS WAF, a list of whitelisted IPs, and Cognito for authentication. They're asking for a private API Gateway that's built "the right way", but their setup doesn't seem to be "private" by definition in AWS terms. I'm struggling to see how setting the APIGW to private would have much of a benefit. They already have a ton of VPN connections to the datacenters, and we're currently troubleshooting latency issues on an unrelated project, so my gut feeling is that routing all the API calls through those VPNs, which are already causing issues, is a recipe for pulling my hair out. I guess my question is this: is there a relatively straightforward way to route the datacenter traffic through the VPNs and into the APIGW while keeping it relatively secure? They're not doing any VPC peering and their traffic is already a mess, so I have my doubts.

SnatchRabbit
Feb 23, 2006

by sebmojo

Twlight posted:

To use a private aws api gateway, you need to set up a VPC endpoint in the accounts you wish to use it in. I believe that is the only way to access private APIs. I *think* with the resource policy within the API you can whitelist IPs, but I'm not sure if that would allow you to jump to it.

Right, that's my understanding as well. So in that VPC with the VPC endpoint, would we need an EC2 instance to pass the information from the VPN to the APIGW?

SnatchRabbit
Feb 23, 2006

by sebmojo

Twlight posted:

I believe so; this of course makes the entire setup less than ideal, with that EC2 being the weak link in the chain. We use private APIGWs to provide a metadata service for customers within our different accounts (interesting data like proxy information). The other rub with private APIs is that they don't accept any sort of CNAME, so you're stuck with the long aws name provided.

Gotcha, that's kind of where I'm headed: get them to give me a good explanation of why this is even necessary given the amount of effort it would take and the complexity of their network. Thanks!

SnatchRabbit
Feb 23, 2006

by sebmojo
Does anyone know how to set up SNS notifications for Config on a per-rule basis? I know that I can remediate rules with the PublishSNSTopic SSM document, but the remediation seems like a manual step unless I'm missing something. Basically I need some way to either query the config rule on a schedule and then publish to SNS, or kick off the remediation on a schedule.

edit: maybe a python lambda calling start_remediation_execution() on the particular rule, running on a cloudwatch schedule? Or am I overthinking it?
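
A sketch of that idea (placeholder rule name, assuming boto3): the lambda looks up the rule's noncompliant resources and kicks the remediation for each, and a CloudWatch Events schedule fires it.

code:
import boto3

config = boto3.client("config")
RULE = "restricted-sshv2"  # placeholder rule name

def lambda_handler(event, context):
    details = config.get_compliance_details_by_config_rule(
        ConfigRuleName=RULE, ComplianceTypes=["NON_COMPLIANT"])
    qualifiers = [r["EvaluationResultIdentifier"]["EvaluationResultQualifier"]
                  for r in details["EvaluationResults"]]
    keys = [{"resourceType": q["ResourceType"], "resourceId": q["ResourceId"]}
            for q in qualifiers]
    if keys:
        # Fires the remediation action (e.g. the PublishSNSTopic document).
        config.start_remediation_execution(ConfigRuleName=RULE, ResourceKeys=keys)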

SnatchRabbit fucked around with this message at 19:26 on Aug 21, 2019

SnatchRabbit
Feb 23, 2006

by sebmojo
Cross posting from the SQL thread:

I have a client that wants to migrate two MSSQL database servers, with 200+ db objects between them, to AWS. Up until this point we've been fine using Data Migration Service to move the table data from their on-prem servers into AWS EC2 (RDS was ruled out due to cost). The problem is that DMS doesn't migrate indexes, users, privileges, stored procedures, or other database changes not directly related to table data. So now we have to generate scripts by hand for these 200+ objects. What I'm asking is: is there some hacky way to automate this migration, or are we just stuck having to do it all by hand over and over again? Is there some option in DMS to make this happen?

SnatchRabbit
Feb 23, 2006

by sebmojo

Agrikk posted:

Always this.

For every project, you should be engaging your TAM (or entire account team) before you start the project. This way you don’t have to reinvent the wheel and you’ll be given best practices for your project, ensuring you get it right straight from the beginning.

Thanks, this is for a client account. Would there be a TAM assigned even on a basic support plan?

SnatchRabbit fucked around with this message at 23:46 on Sep 24, 2019

SnatchRabbit
Feb 23, 2006

by sebmojo
Has anyone had any luck copying objects in bulk from an FTP server (or any server, really) into S3, ideally using the sync command but not required, keeping the source file's attributes (file created/updated, tags, etc.) and populating that data into the S3 object's custom metadata? Really, my only requirement is that I want to know the source file's creation date/time on the FTP server and have that value stuck into a custom metadata tag on S3. This sounds like an easy thing, but I'm just not seeing an obvious solution. I thought maybe something like S3 Browser might have it built in, but I'm just not seeing it.

SnatchRabbit
Feb 23, 2006

by sebmojo

deedee megadoodoo posted:

I think you’ll have to write a custom script. Either upload the files one at a time and set the tag on each object or run a sync and then have a script set the date tag on every object in s3.

Yeah, I think that's the conclusion I've come to. Just sketched out a basic bash script like so:

code:
#!/bin/bash
bucket=my-bucket   # placeholder target bucket

for filename in ./*; do
  # source file's last-modified time (GNU stat; on BSD/macOS use: stat -f "%Sm")
  mtime=$(stat -c "%y" "$filename")
  echo "$filename" "$mtime"
  # double quotes so $mtime actually expands; the CLI prepends x-amz-meta-
  # to metadata keys itself, so the key is just "mtime"
  aws s3 cp "$filename" "s3://$bucket/$filename" --metadata "{\"mtime\":\"$mtime\"}"
done


SnatchRabbit
Feb 23, 2006

by sebmojo
Has anyone taken the AWS Networking Specialty Exam? I've been working with AWS for almost 10 years and I can build VPCs, TGWs, and routes in my sleep at this point. I'm sure there are some service-level gotchas, but for anyone that has passed the exam, how did you find it? I aced the Security Specialty and it was mostly rehashed SA Pro questions...
