MF_James
May 8, 2008
I CANNOT HANDLE BEING CALLED OUT ON MY DUMBASS OPINIONS ABOUT ANTI-VIRUS AND SECURITY. I REALLY LIKE TO THINK THAT I KNOW THINGS HERE

INSTEAD I AM GOING TO WHINE ABOUT IT IN OTHER THREADS SO MY OPINION CAN FEEL VALIDATED IN AN ECHO CHAMBER I LIKE

a hot gujju bhabhi posted:

Thanks for the info, super helpful. I looked at the traffic using tcpdump as suggested and it definitely initiates using a different port each time, but always requests port 80 on the LB. Is this a problem? Sorry for the potentially stupid question, I'm far more Dev than Ops unfortunately.

It should request port 80, because that's the port the LB is listening on to do its LB thing. You could change that port, but then it would always request whatever port you changed it to, since that's where the load-balancing service listens. The source port, on the other hand, should change with every connection.
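
If you want to see that for yourself, here's a minimal Python sketch (example.com and port 80 are just stand-ins for your LB): each new connection gets a fresh OS-assigned ephemeral source port while the destination port stays fixed.

code:
import socket

# Open a few TCP connections to the same destination port and print the
# source port the OS picked each time. The destination port never changes;
# the ephemeral source port almost always does.
for _ in range(3):
    s = socket.create_connection(("example.com", 80), timeout=5)
    print(f"source port {s.getsockname()[1]} -> destination port {s.getpeername()[1]}")
    s.close()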

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION
After some further investigation, it seems the load balancer itself is actually doing okay, but one of the servers has a much harder time processing requests: its CPU spikes regularly even though it's handling the same volume as the others. Unfortunately this was all set up long before my arrival, so these VMs are far from immutable (in fact they're significantly mutated), and it wouldn't surprise me if there's a configuration defect on that specific VM.

Thanks for the help anyway guys, I definitely learned a lot.

Pollyanna
Mar 5, 2005

Milk's on them.


Throwing in my two cents on DocumentDB - it has performed far worse than MongoDB in each of our performance tests, to the point where we jump from 20% of our time in Mongo to >80% of our time in DocDB. I can’t in all honesty recommend it if latency/performance is critical.

JHVH-1
Jun 28, 2002

a hot gujju bhabhi posted:

After some further investigation, it seems the load balancer itself is actually doing okay, but one of the servers has a much harder time processing requests: its CPU spikes regularly even though it's handling the same volume as the others. Unfortunately this was all set up long before my arrival, so these VMs are far from immutable (in fact they're significantly mutated), and it wouldn't surprise me if there's a configuration defect on that specific VM.

Thanks for the help anyway guys, I definitely learned a lot.

Automate the whole server config and then burn them all to the ground! DevOps anarchy!

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
You can use a tool like goss to identify a lot of system configurations, emit a policy, and then use the tests to validate any new container or EC2 instance you build with automation.

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION

necrobobsledder posted:

You can use a tool like goss to identify a lot of system configurations, emit a policy, and then use the tests to validate any new container or EC2 instance you build with automation.

This looks amazing, but sadly Linux only 😢

Thanks Ants
May 21, 2004

#essereFerrari


How about https://devblogs.microsoft.com/scripting/reverse-desired-state-configuration-how-it-works/

Lily Catts
Oct 17, 2012

Show me the way to you
(Heavy Metal)
For learning purposes, I'm creating a Twitter bot that hourly tweets random lyrics/phrases and I need to decide whether I should implement it on Lambda or EC2 (free tier).

The basic gist is:

code:
1 Read list A containing items to be tweeted
2 Read list B containing items that have already been tweeted for the day
3 Pick an item from list A at random
4 If that item is not in list B,
5   Tweet it
6   Add tweeted item to list B
7 Else,
8   Go to 3 (assume we have at least 24 items to cover an entire day)
9 If list B already has 24 items, clear it
Based on my knowledge, for Lambda you have to get the data from somewhere (as it's serverless), so it would be either in simple JSON files in S3 (list B could be simpler even), or DynamoDB. Is simple file I/O in Lambda permissible and well within the free tier? To implement on EC2 would be more straightforward, but I've done that before (I have a Twitter bot running on Heroku that does the same thing) and I prefer to go down the road not taken.

What do you think? At most the bot will only run 24 times a day.
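
For what it's worth, here's a minimal sketch of what the Lambda + S3 version could look like, assuming boto3 and two small JSON files; the bucket name, keys, and send_tweet call are placeholders rather than real resources.

code:
import json
import random

import boto3

s3 = boto3.client("s3")

BUCKET = "my-twitter-bot"   # hypothetical bucket
ALL_KEY = "list_a.json"     # list A: everything that can be tweeted
SENT_KEY = "list_b.json"    # list B: items already tweeted today


def load_json(key, default):
    try:
        return json.loads(s3.get_object(Bucket=BUCKET, Key=key)["Body"].read())
    except s3.exceptions.NoSuchKey:
        return default


def lambda_handler(event, context):
    items = load_json(ALL_KEY, [])
    sent = load_json(SENT_KEY, [])

    if len(sent) >= 24:      # list B already covers the whole day: clear it
        sent = []

    # Assumes list A has at least 24 items, per the pseudocode above.
    choice = random.choice([i for i in items if i not in sent])
    # send_tweet(choice)     # plug in the Twitter client of your choice here

    sent.append(choice)
    s3.put_object(Bucket=BUCKET, Key=SENT_KEY, Body=json.dumps(sent))
    return {"tweeted": choice}
Triggered by an hourly EventBridge schedule rule, 24 invocations and a handful of tiny S3 reads/writes a day should sit comfortably inside the Lambda and S3 free tiers.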

The Fool
Oct 16, 2003


Schneider Heim posted:

I prefer to go down the road not taken.

Do it in azure functions

xpander
Sep 2, 2004

Schneider Heim posted:

For learning purposes, I'm creating a Twitter bot that hourly tweets random lyrics/phrases and I need to decide whether I should implement it on Lambda or EC2 (free tier).

The basic gist is:

code:
1 Read list A containing items to be tweeted
2 Read list B containing items that have already been tweeted for the day
3 Pick an item from list A at random
4 If that item is not in list B,
5   Tweet it
6   Add tweeted item to list B
7 Else,
8   Go to 3 (assume we have at least 24 items to cover an entire day)
9 If list B already has 24 items, clear it
Based on my knowledge, for Lambda you have to get the data from somewhere (as it's serverless), so it would be either in simple JSON files in S3 (list B could be simpler even), or DynamoDB. Is simple file I/O in Lambda permissible and well within the free tier? To implement on EC2 would be more straightforward, but I've done that before (I have a Twitter bot running on Heroku that does the same thing) and I prefer to go down the road not taken.

What do you think? At most the bot will only run 24 times a day.

This is easily doable in Lambda, reading files from S3 is a very straightforward operation. If you haven't used FaaS before, that sounds like a great first project. Just be sure to do your file i/o in /tmp. For bonus points, figure out how to keep the function warm and check /tmp first to see if your files are still there before downloading from S3.
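
The /tmp caching bonus point looks roughly like this in Python (bucket and key names are hypothetical); /tmp only persists while the execution environment stays warm, so treat it strictly as a cache.

code:
import os

import boto3

s3 = boto3.client("s3")
CACHE_PATH = "/tmp/list_a.json"   # survives warm invocations, lost on cold starts


def get_list_a(bucket="my-twitter-bot", key="list_a.json"):
    # Only download from S3 if the warm container doesn't already have the file.
    if not os.path.exists(CACHE_PATH):
        s3.download_file(bucket, key, CACHE_PATH)
    with open(CACHE_PATH) as f:
        return f.read()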

Pollyanna
Mar 5, 2005

Milk's on them.


I have a CloudWatch log group (made of multiple log streams) and a feed of CloudTrail events (basically a change log/access log for an S3 bucket) that I want to compose into a single log stream for ease of viewing, searching, and auditing. I was under the impression that I could accomplish this with CloudWatch itself, but on further investigation, I’m not sure if that’s actually the case. What’s my best option for composing a CloudWatch log group and a series of CloudTrail events into one single log stream?

deedee megadoodoo
Sep 28, 2000
Two roads diverged in a wood, and I, I took the one to Flavortown, and that has made all the difference.


You know what's dumb as hell? The fact that AWS doesn't publish a list of changes that are coming. Sure, your TAM can tell you what changes are coming, but it's under NDA. Also, not every org has a TAM, and in some orgs the TAM is gated behind bureaucracies that don't pass along information that isn't relevant to them but is to other people.

That said, it's loving hilarious that one of my coworkers spent like 2 weeks rebuilding some of our infrastructure only for an announcement to come out today that would have eliminated those two weeks of work.

Agrikk
Oct 17, 2003

Take care with that! We have not fully ascertained its function, and the ticking is accelerating.
AWS employees don’t hear about upcoming re:Invent launches until re:Invent either, so yeah, I feel your pain.

Cancelbot
Nov 22, 2006

Canceling spam since 1928

In my previous org's experience (I officially become a TAM in one day!!): if you're doing something that pushes the edges of an AWS service and you have a nice TAM, you can get included in alpha and beta programs, with the service teams as your first port of call.

I've done one usability study of an upcoming product, and my previous team is on an alpha product. I can't tell you anything more specific about either, but getting involved definitely depends on several stars aligning.

Cancelbot fucked around with this message at 13:54 on Nov 17, 2019

FamDav
Mar 29, 2008
We’re working on being better about this, at least in the groups I work with like containers and app mesh.

IMO while re:invent and other conferences are a great way to broadcast major features/services to customers, the best thing we can do is get our plans and designs in front of as many potential users as possible, as early as possible.

Cerberus911
Dec 26, 2005
Guarding the damned since '05
Loving all the new announcements coming out prior to re:Invent.

This will be my first time going to re:invent. Any tips from people who have been there before?

Agrikk
Oct 17, 2003

Take care with that! We have not fully ascertained its function, and the ticking is accelerating.
Be careful how you book sessions. If you are sloppy you can walk ten miles in a single day, like a customer did.

Adhemar
Jan 21, 2004

Kellner, da ist ein scheussliches Biest in meiner Suppe.

Agrikk posted:

Be careful how you book sessions. If you are sloppy you can walk ten miles in a single day, like a customer did.

New service: AWS Exercise.

deedee megadoodoo
Sep 28, 2000
Two roads diverged in a wood, and I, I took the one to Flavortown, and that has made all the difference.


Agrikk posted:

Be careful how you book sessions. If you are sloppy you can walk ten miles in a single day, like a customer did.

Last year was my first time and it was definitely a learning experience. My big takeaways were:

1) Try to book all your sessions for a day in the same venue. Or at least schedule things so you only need to change venues once in a day.

2) Give yourself at least an hour to move between venues.

3) The MGM Grand is the worst hotel option because of 1 and 2.

Scrapez
Feb 27, 2004

I have an environment set up in AWS where I have a bastion instance in a public subnet and multiple other EC2 instances in private subnets. I have an EFS set up and mounted on all the machines in the private subnets.

What is the best method for transferring files between my PC and the EFS? As it sits now, to get a file from the EFS to my local machine, I have to scp it out to the bastion instance and then scp it from the bastion back to my PC.

I noticed AWS DataSync but that seems to be for copying huge swaths of data to an EFS in one fell swoop rather than transferring individual log files from time to time like I'm trying to do.

Is there a better way than secure copying the file twice to get it back to my machine?

fluppet
Feb 10, 2009
Either proxy an SSH session through the bastion and use scp, or I believe you can tunnel it through an SSM session.

trem_two
Oct 22, 2002

it is better if you keep saying I'm fat, as I will continue to score goals
Fun Shoe
Possibly overkill, but you could configure an S3 VPC Endpoint, and push/retrieve the files from S3.

crazypenguin
Mar 9, 2005
nothing witty here, move along
IMO, you should never actually be logged into bastions.

Create local SSH configs that use ProxyJump to bounce through the bastion. Then you barely notice you’re using a bastion at all. scp away.

Docjowles
Apr 9, 2009

I like all of the above suggestions. Either configure an SSH proxy, or set up some kind of S3 + Lambda arrangement that copies files to the destination every time a new object shows up in the bucket.
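
If you go the S3 + Lambda route, the copy function itself is only a few lines. A minimal sketch, assuming an S3 ObjectCreated event notification wired to the function and a hypothetical destination bucket:

code:
import boto3

s3 = boto3.client("s3")
DEST_BUCKET = "my-log-dropbox"   # hypothetical destination bucket


def lambda_handler(event, context):
    # Invoked by an S3 ObjectCreated event notification on the source bucket.
    for record in event["Records"]:
        src_bucket = record["s3"]["bucket"]["name"]
        # Event keys are URL-encoded; run urllib.parse.unquote_plus() on them
        # if your keys contain spaces or special characters.
        key = record["s3"]["object"]["key"]
        s3.copy_object(
            Bucket=DEST_BUCKET,
            Key=key,
            CopySource={"Bucket": src_bucket, "Key": key},
        )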

Scrapez
Feb 27, 2004

crazypenguin posted:

IMO, you should never actually be logged into bastions.

Create local SSH configs that use ProxyJump to bounce through the bastion. Then you barely notice you’re using a bastion at all. scp away.

How didn't I know about this? Awesome! Thank you.

putin is a cunt
Apr 5, 2007

BOY DO I SURE ENJOY TRASH. THERE'S NOTHING MORE I LOVE THAN TO SIT DOWN IN FRONT OF THE BIG SCREEN AND EAT A BIIIIG STEAMY BOWL OF SHIT. WARNER BROS CAN COME OVER TO MY HOUSE AND ASSFUCK MY MOM WHILE I WATCH AND I WOULD CERTIFY IT FRESH, NO QUESTION
Curious to hear some feedback from anyone who has tried LocalStack for developing AWS stuff without actually spinning up and tearing down real resources. It looks super promising to me, but I've never used it in practice, so I'm keen to hear some thoughts from you much more experienced gurus.

Blinkz0rz
May 27, 2001

MY CONTEMPT FOR MY OWN EMPLOYEES IS ONLY MATCHED BY MY LOVE FOR TOM BRADY'S SWEATY MAGA BALLS
It's good and definitely worth using, but there are some weird gotchas the further you get from the popular services.

necrobobsledder
Mar 21, 2005
Lay down your soul to the gods rock 'n roll
Nap Ghost
LocalStack has quirks around stuff like IAM that won't work terribly well if you're doing complex things like cross-account workflows, but it's fine for unit tests in theory. Problem is, if you're doing that and calling it a unit test, you might as well use Moto: Moto's bugs are easier to understand and hack through, and it works as a network service too.
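
For reference, a minimal Moto-based test looks something like this (recent Moto versions expose a single mock_aws decorator; older ones use per-service decorators like mock_s3):

code:
import boto3
from moto import mock_aws   # older moto versions: from moto import mock_s3


@mock_aws
def test_round_trip_through_s3():
    # Everything inside the decorated test talks to moto's in-memory fake, not real AWS.
    s3 = boto3.client("s3", region_name="us-east-1")
    s3.create_bucket(Bucket="test-bucket")
    s3.put_object(Bucket="test-bucket", Key="hello.txt", Body=b"hi")
    assert s3.get_object(Bucket="test-bucket", Key="hello.txt")["Body"].read() == b"hi"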

Blinkz0rz
May 27, 2001

MY CONTEMPT FOR MY OWN EMPLOYEES IS ONLY MATCHED BY MY LOVE FOR TOM BRADY'S SWEATY MAGA BALLS
Fwiw localstack is a wrapper over Moto.

To give the OP some perspective, our product relies heavily on S3, SQS, and SNS, which work pretty well in localstack, so we use it to spin up a solid approximation of our environment on each engineer's machine rather than having a dev AWS account for that kind of stuff.
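
The usual pattern is to point your SDK clients at LocalStack's edge endpoint (typically port 4566) with throwaway credentials, since LocalStack doesn't validate them. A rough sketch:

code:
import boto3

# LocalStack's default edge endpoint; the credentials can be anything.
localstack = dict(
    endpoint_url="http://localhost:4566",
    region_name="us-east-1",
    aws_access_key_id="test",
    aws_secret_access_key="test",
)

s3 = boto3.client("s3", **localstack)
sqs = boto3.client("sqs", **localstack)

s3.create_bucket(Bucket="dev-bucket")
queue_url = sqs.create_queue(QueueName="dev-queue")["QueueUrl"]
print(queue_url)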

the talent deficit
Dec 20, 2003

self-deprecation is a very british trait, and problems can arise when the british attempt to do so with a foreign culture





a hot gujju bhabhi posted:

Curious to hear some feedback from anyone who has tried LocalStack for developing AWS stuff without actually spinning up and tearing down real resources. It looks super promising to me, but I've never used it in practice, so I'm keen to hear some thoughts from you much more experienced gurus.

localstack is ok for when you just need to satisfy a dependency and can't be bothered to swap out sqs/sns/whatever for something locally runnable, but it's not a very accurate recreation of the services it replaces, so you can't rely on it if you're testing the thing itself

BaseballPCHiker
Jan 16, 2006

I just took and passed my AWS Solutions Architect exam and wanted some advice on where to go from here.

I work in networking primarily and don't use AWS at all in my day-to-day work. But I had a lot of fun studying for the exam, and did some rinky-dink stuff like building WordPress sites, playing around with EC2, building home backup solutions for myself with S3, etc.

I don't expect to get some high-paying job using AWS based just on that cert, but I also can't afford to take a pay cut for an entry-level job that uses it. I'm also not a programmer who can take advantage of a lot of AWS services.

Based on all of the above, are any additional certs worth studying for? Or am I better off just trying to build stuff for myself and my organization?

What a long-winded post. I guess I just wanted to celebrate passing and to say how fun the exam was to study for.

Docjowles
Apr 9, 2009

Grats on passing! At this point, getting actual experience with real, non-toy work projects will probably be the most valuable thing. If you can in your current role. The cert is just a starting point, but a very good one, since AWS is so sprawling and complex that it's very easy to design a system that totally sucks or costs 10x what it should. So having that foundation of "here is how not to immediately shoot your foot off in the cloud" is awesome :)

There's also the "DevOps" certification track, which as I understand it is more about day-two operational tasks: how to keep the thing the solutions architect handed you running, monitored, secured, etc. If that sounds interesting, that's another area you could explore.

There are TONS of jobs out there looking for people with AWS expertise. If you can get some real projects under your belt you should definitely be able to command a raise and/or a new job if you want it. I've even come across multiple networking-specific jobs, and I wasn't even directly looking for them. They usually want someone with a traditional networking/security background plus some cloud chops to build a hybrid on-prem/cloud solution. So if you can prove you know your way around both a router CLI and VPCs, Direct Connect, Transit Gateways, etc., there are opportunities out there for you.

BaseballPCHiker
Jan 16, 2006


Thanks for the info and the kind words Doc. I'll try and apply the knowledge to projects at work that make sense, and if not just keep plugging away here and there on personal projects for fun.

fluppet
Feb 10, 2009
Got my DevOps Pro exam booked for later this week. How representative are the ACG practice exams?

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
For AWS Transfer, is there any way to set the "Restricted" flag for a user account programmatically? Using the CLI I'm not seeing any value in the "describe-user" JSON that matches that setting, and I don't see it as an option in the "create-user" command. I have a client that wants to use the same bucket for a ton of clients, and generating and locking their user accounts to their S3 key location without clicking through dozens of accounts would be nice.

Pollyanna
Mar 5, 2005

Milk's on them.


PierreTheMime posted:

For AWS Transfer, is there any way to set the "Restricted" flag for a user account programmatically? Using the CLI I'm not seeing any value in the "describe-user" JSON that matches that setting, and I don't see it as an option in the "create-user" command. I have a client that wants to use the same bucket for a ton of clients, and generating and locking their user accounts to their S3 key location without clicking through dozens of accounts would be nice.

I talked to my company’s TAM recently asking if it was possible, and their response was basically “we’ll keep it in mind”. Prolly not happening anytime soon. I feel your pain, friend :(

We’ve been using scope-down policies in lieu of that option. The details escape me right now, but I can ask our SREs to explain tomorrow. There might even be docs on it somewhere, but I don’t know for sure.
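
For anyone who finds this later: a scope-down policy (AWS now calls it a session policy) is attached per user and uses ${transfer:...} variables, so one document can cover every user in the shared bucket. A rough sketch via boto3, loosely following the shape of the example in the AWS Transfer docs; the server ID, role ARN, and paths are hypothetical.

code:
import json

import boto3

transfer = boto3.client("transfer")

# The ${transfer:*} variables are resolved per user at session time.
session_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "ListOnlyOwnFolder",
            "Effect": "Allow",
            "Action": "s3:ListBucket",
            "Resource": "arn:aws:s3:::${transfer:HomeBucket}",
            "Condition": {
                "StringLike": {
                    "s3:prefix": ["${transfer:HomeFolder}/*", "${transfer:HomeFolder}"]
                }
            },
        },
        {
            "Sid": "ReadWriteOwnFolder",
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
            "Resource": "arn:aws:s3:::${transfer:HomeDirectory}*",
        },
    ],
}

transfer.create_user(
    ServerId="s-1234567890abcdef0",                          # hypothetical
    UserName="user1",
    Role="arn:aws:iam::123456789012:role/sftp-s3-transfer",  # hypothetical
    HomeDirectory="/mybucket/sftp/user1",
    Policy=json.dumps(session_policy),
)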

Pollyanna
Mar 5, 2005

Milk's on them.


Where did my post go? EDIT: Oh there it is.

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord

Pollyanna posted:

I takes to my company’s TAM recently asking if it was possible, and their response was basically “we’ll keep it in mind”. Prolly not happening anytime soon. I feel your pain friend :(

We’ve been using scope-down policies in lieu of that option. The details escape me right now, but I can ask our SREs to explain tomorrow. There might even be docs on it somewhere, but I don’t know for sure.

I got a response back from support, and the answer is (as with a lot of things) that it works, but only if you enter it in a specific undocumented way. Now that I have an example, I should be able to get everything going. They said they already put in a documentation update request, so there's progress at least.

Support notes: "In this command, note that we don't need to use the "--home-directory" parameter. This is because it's specified inside "--home-directory-mappings" under the target."

Example:
code:
aws transfer create-user \
    --server-id SERVERID \
    --user-name user1 \
    --role arn:aws:iam::ACCOUNTID:role/sftp-s3-transfer \
    --home-directory-type LOGICAL \
    --home-directory-mappings Entry="/",Target="/mybucket/sftp/user1"
{
    "UserName": "user1",
    "ServerId": "SERVERID"
}

PierreTheMime fucked around with this message at 20:57 on Dec 7, 2019

Hughlander
May 11, 2005

Not sure what thread this should go in, but I want to get an Elastic IP and VPN it to a set of containers on my NAS. Is that just going to be a VPC, an Elastic IP, and a VPN endpoint? Or is there more to it than that?

Rationale: I need to upgrade my EC2 instance to a higher machine class, or I could just use my home NAS, but I don’t want people knowing my home IP / I want a stable IP when it changes.

PierreTheMime
Dec 9, 2004

Hero of hormagaunts everywhere!
Buglord
I'm bouncing all over the place with services lately and have hit another issue. I'm trying to use a Cognito User Pool to control access to an API Gateway, but while setting up my Authorizer it fails testing (returns "Unauthorized request") using an access_token from my REST call to Cognito. I've looked over a few example videos and I seem to be doing the same things; mine just doesn't work. :( Anyone hit this issue before? I'm sure it's something really simple I'm missing.
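
One gotcha that trips a lot of people up, assuming a plain Cognito User Pool authorizer with no OAuth scopes set on the method: the authorizer validates the ID token, so an access_token comes back as Unauthorized. A rough sketch with a hypothetical API URL and app client ID (no client secret), using the USER_PASSWORD_AUTH flow, which has to be enabled on the app client:

code:
import boto3
import requests

API_URL = "https://abc123.execute-api.us-east-1.amazonaws.com/prod/items"  # hypothetical
CLIENT_ID = "your-app-client-id"                                           # hypothetical

cognito = boto3.client("cognito-idp")
auth = cognito.initiate_auth(
    ClientId=CLIENT_ID,
    AuthFlow="USER_PASSWORD_AUTH",   # must be enabled on the app client
    AuthParameters={"USERNAME": "user1", "PASSWORD": "correct horse battery staple"},
)

# Send the *ID* token, not the access token, as the raw JWT in the header the
# authorizer expects (Authorization by default, unless you changed the token source).
id_token = auth["AuthenticationResult"]["IdToken"]
resp = requests.get(API_URL, headers={"Authorization": id_token})
print(resp.status_code, resp.text)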
