skipdogg
Nov 29, 2004
Resident SRT-4 Expert

KMS 1.2 running on Server 2003 with Windows 7 clients is fine. If you want to use KMS with Windows 8 or Server 2012 you'll need to move KMS to a newer host running Server 2008 or later.

To the best of my knowledge there is no KMS in any Server 2003 product. We just use older style Volume License keys with those installs. They're not even MAK.


Naramyth
Jan 22, 2009

Australia cares about cunts. Including this one.

devmd01 posted:

Oh boy, we're taking email and office ~to the cloud~ and going office365 across the enterprise.

What are some major tips/tricks/suggestions we should consider in our implementation? I won't be doing the back-end work, more on the client configuration side but any tips are welcome.

You will never get delayed email issues resolved. Getting out of PST hell with 25GB inboxes is worth it though.

Wicaeed
Feb 8, 2005
Quick question:

Our company is looking to deploy a bunch of servers in Europe. We already have a team over there that operates semi-independently from our company.

These servers need to talk to our production domain, and in some cases use the same user account that auto-logs on to our production servers.

Would we be better off just making a new AD site for our new EU datacenter and then install some RODC's there, and host the various roles (dhcp/dns) on those RODC's, or should we create a new child domain in this case?

I realize it's a really open-ended question, but any help is appreciated.

incoherent
Apr 24, 2004

0101010001101000011100100110100101101100011011000110010101110010
I would recommend a new geo-specific domain in your forest if you're moving to another continent. If poo poo gets real bad real fast, that RODC won't be of much use.

The admins can work semi-autonomously and would simplify management and reporting quite a bit.

hihifellow
Jun 17, 2005

seriously where the fuck did this genre come from

hihifellow posted:

KMS is breaking me. We have a Server 2003 box hosting KMS 1.2. It has a Windows 7 KMS installed and activated. Our Windows 7 clients are activating without intervention so everything appears to be working, but according to Microsoft I should be using a Server 2003 KMS key. I have no idea if these exist, and I certainly don't have one listed in my spreadsheet of keys. Are we heading towards ruin by using a Win7 key on a 2003 host, or should I just leave it since it seems to be working?

Finally found out the host key only determines what KMS will activate; you can throw a 2008 R2 C license on a 2003 host and it'll work fine. I couldn't find this information anywhere except in a TechNet post, and that's how it appears to be working, so 10 points from Microsoft for not documenting this poo poo :argh:
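If anyone else hits this, checking what the host thinks it's serving is just slmgr. Here's a sketch that only assembles the command lines (the host name and key are placeholders, nothing runs against a real box):

```python
# Sketch: build the slmgr.vbs command lines for checking/installing a KMS host key.
# This just produces strings to review; run them on (or against) the KMS host.

def kms_check_command(host: str) -> str:
    """Detailed license view: shows which KMS channel the installed host key
    belongs to and the current client count (client OSes need 25 requests
    before KMS starts activating them)."""
    return f"cscript //nologo slmgr.vbs {host} /dlv"

def kms_install_command(host: str, key: str) -> str:
    """Install a different KMS host key; follow up with /ato to activate it."""
    return f"cscript //nologo slmgr.vbs {host} /ipk {key}"

if __name__ == "__main__":
    print(kms_check_command("kms01"))
    print(kms_install_command("kms01", "XXXXX-XXXXX-XXXXX-XXXXX-XXXXX"))
```

The /dlv output is where you'd confirm exactly what the 2008 R2 C key on the 2003 host is willing to activate.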

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

Wicaeed posted:

Quick question:

Our company is looking to deploy a bunch of servers in Europe. We already have a team over there that operates semi-independently from our company.

These servers need to talk to our production domain, and in some cases use the same user account that auto-logs on to our production servers.

Would we be better off just making a new AD site for our new EU datacenter and then install some RODC's there, and host the various roles (dhcp/dns) on those RODC's, or should we create a new child domain in this case?

I realize it's a really open-ended question, but any help is appreciated.

I'm a big fan of keeping it as simple as possible. If another AD site and delegating permissions to the necessary folks works for you, go for it. Why RODC's? Is security an issue there? Personally unless you have a really good reason I prefer the single domain, single forest model with OU based delegation.

Swink
Apr 18, 2006
Left Side <--- Many Whelps
We have an RDSGateway for staff to access stuff on the road.

How should I conveniently package the required certificate file for staff who work from home on non-domain-joined PCs? Is there anything more elegant than a .bat that runs certutil?

Jesus Stick
Dec 14, 2004

Bomb Hills, Not Countries
We have a new client, who is a complete wreck. We're working to get all this poo poo replaced, but in the mean time, we gotta limp them along. Here is the situation:

Site: Carson has 3 domain controllers (INTERNET, ENGINEERING, and ACCOUNTING)

ENGINEERING is the FSMO Role holder of all five roles, and as luck would have it, only boots into Safe Mode, and has for apparently 60 days now. In any other scenario, I would just seize the roles to INTERNET, which is the other DNS server, and turn off ENGINEERING for good.

Unfortunately, we can't do that. ENGINEERING is also the main file server, with a BUNCH of IDE drives attached to it and shared out to users, and it is their main file storage. Until now, everything has been limping along fine, but today, poo poo hit the fan with this DSRM DC and domain authentications.

That said, I know when you seize roles from a DC, it should never touch the network again, but what if I seize the roles and just leave it in DSRM so the shares stay available? I'm talking TBs of data, so it's not easy or feasible to plug these drives in somewhere else (whitebox servers galore). I literally cannot find ONE article talking about this. Anyone have any thoughts?

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

Spend the 300 bucks and open a case with Microsoft is my honest opinion. It's cheap insurance to cover your rear end.

Are there backups of anything? Getting the shared data mounted anywhere else shouldn't be much of an issue, assuming there's actually a backup to restore from. I'm going to guess there isn't though. Oh how I don't envy you MSP folks.

edit: After looking around a bit, if you seize the roles to INTERNET and do a dcpromo /forceremoval on ENGINEERING you might be OK. That'll nuke all AD data from the server. You'll have to manually clean up the metadata, but that's not a big deal. I'm a chickenshit though and would have Microsoft on the phone while I did it. I've been in corp IT too long, always thinking about covering my rear end if something goes wrong.
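For reference, the seizure itself is just an ntdsutil session. Here's a sketch that writes out the input you'd feed it — target DC name taken from the post above, but double-check everything before pasting it into a real session:

```python
# Sketch: emit the ntdsutil input for seizing all five FSMO roles.
# This only generates text for review; it doesn't touch AD.

SEIZE_ROLES = [
    "seize schema master",
    "seize naming master",        # the domain naming master
    "seize RID master",
    "seize PDC",
    "seize infrastructure master",
]

def ntdsutil_seize_script(target_dc: str) -> str:
    lines = [
        "roles",
        "connections",
        f"connect to server {target_dc}",
        "quit",                   # leave the connections menu
        *SEIZE_ROLES,
        "quit",                   # leave the roles menu
        "quit",                   # exit ntdsutil
    ]
    return "\n".join(lines)

if __name__ == "__main__":
    print(ntdsutil_seize_script("INTERNET"))
```

Each seize prompts for confirmation interactively, and after it's done you'd still do the metadata cleanup for the dead DC as mentioned above.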

skipdogg fucked around with this message at 06:55 on Aug 7, 2013

Jesus Stick
Dec 14, 2004

Bomb Hills, Not Countries
When we came on board, the backups were a Free File Sync utility running at every location, copying files from shares, to other shares, from other servers, etc. It was a loving nightmare. So no, no legitimate backups of that server. We have an Axcient backup appliance in now, but it can't do this server... so that's fun.

Frozen Peach
Aug 25, 2004

garbage man from a garbage can
Is there a way to give a user admin rights but only when running a specific application? We'd like to grant a user access to install updates to a few programs that constantly bug the user about updates, but we don't want to have to go out to these users' desks and manually type in an admin account every time. I've tried granting the user access to the specific folders and registry entries the application uses when updating, but it still asks for admin rights to run the updater. I'd love to be able to white-list a folder/publisher and be able to give users access to run those installers "as an admin" without giving them full admin rights.

It looks like AppLocker might be able to do this, but that doesn't work on Windows 7 Professional. Does Software Restriction Policies give me this ability, or is that purely for blocking apps from running? I feel like I'm missing something that would let me grant tighter access to installers without having to manually type an admin password on the few computers that need a software update.

Helushune
Oct 5, 2011

Is there a way to have UAC disabled on Windows 8 Enterprise and still have Metro applications work properly?

UAC has to be disabled on my work's domain in order for certain things to work properly, and I've been trying to write a group policy (on Server 2008 R2) for our new Surface Pros and for future testing of Windows 8. It should also be noted that the user accounts that would be using the Surfaces and laptops/desktops require local administrator. I've been able to get the following results:
  • UAC disabled, "Metro applications refuse to run without UAC enabled" error message
  • UAC disabled, "Metro applications refuse to run with any account that has local administrator" error message
  • UAC enabled, Metro applications run fine but roaming profiles are broken, no one is able to save to any local drives, all other terrible things when UAC is enabled
I've tried setting the EnableLUA dword to 1, which just took me to bullet point #2, and I've been unable to come up with a solution.
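The values I've been poking at all live under the same key, for anyone else digging into this. A quick sketch that just builds the reg.exe commands (the helper names are mine; the three value names are the standard UAC knobs, and EnableLUA changes only take effect after a reboot):

```python
# Sketch: assemble reg.exe queries/sets for the UAC-related registry values.
# Only builds command-line strings; nothing here touches the registry.

UAC_KEY = r"HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion\Policies\System"
UAC_VALUES = [
    "EnableLUA",                   # master UAC on/off switch
    "ConsentPromptBehaviorAdmin",  # how admins get prompted
    "FilterAdministratorToken",    # whether the built-in Administrator gets a split token
]

def query_commands() -> list[str]:
    return [f'reg query "{UAC_KEY}" /v {name}' for name in UAC_VALUES]

def set_command(name: str, data: int) -> str:
    """Build a reg add for one value. Remember: EnableLUA needs a reboot."""
    return f'reg add "{UAC_KEY}" /v {name} /t REG_DWORD /d {data} /f'

if __name__ == "__main__":
    for cmd in query_commands():
        print(cmd)
    print(set_command("EnableLUA", 1))
```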

capitalcomma
Sep 9, 2001

A grim bloody fable, with an unhappy bloody end.

Frozen-Solid posted:

Is there a way to give a user admin rights but only when running a specific application? We'd like to grant a user access to install updates to a few programs that constantly bug the user about updates, but we don't want to have to go out to these users' desks and manually type in an admin account every time. I've tried granting the user access to the specific folders and registry entries the application uses when updating, but it still asks for admin rights to run the updater. I'd love to be able to white-list a folder/publisher and be able to give users access to run those installers "as an admin" without giving them full admin rights.

It looks like AppLocker might be able to do this, but that doesn't work on Windows 7 Professional. Does Software Restriction Policies give me this ability, or is that purely for blocking apps from running? I feel like I'm missing something that would let me grant tighter access to installers without having to manually type an admin password on the few computers that need a software update.

I remember reading this tip on SA several years ago. It's not mine, but I forgot who originally wrote it:

quote:

Let users run stuff as admin: Create a scheduled task with the executable running as admin, let your users run scheduled tasks on their workstations. Don't actually schedule the task to run but replace the normal launch links with links to the task. Any time the user tries to run the program it launches it with the admin credentials and bypasses UAC.

Give it a shot.
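If it helps, the trick boils down to two schtasks command lines. Here's a sketch that just assembles them — the task name, path, and account are all made up, and stashing the password in /rp is its own security trade-off to think about:

```python
# Sketch of the scheduled-task elevation trick above, expressed as the two
# schtasks command lines involved. All names/paths/accounts are placeholders.

def create_task_command(task: str, exe: str, run_as: str, password: str) -> list[str]:
    """Create the task once, as an admin: it runs <exe> elevated under <run_as>.
    /sc once /st 00:00 just satisfies schtasks' need for a schedule; nothing
    fires on that trigger -- users launch the task on demand instead."""
    return [
        "schtasks", "/create",
        "/tn", task,
        "/tr", exe,
        "/sc", "once", "/st", "00:00",
        "/ru", run_as, "/rp", password,
        "/rl", "highest",          # run with the elevated token, bypassing UAC
    ]

def run_task_command(task: str) -> list[str]:
    """What the replacement shortcut points at instead of the real exe."""
    return ["schtasks", "/run", "/tn", task]

if __name__ == "__main__":
    print(" ".join(create_task_command("RunUpdater", r"C:\Apps\updater.exe",
                                       r"CORP\svc-updates", "hunter2")))
    print(" ".join(run_task_command("RunUpdater")))
```

Swap the user's normal launch shortcut for one that runs the second command and they get the elevated updater without ever seeing a UAC prompt.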

Demie
Apr 2, 2004
I'd appreciate some general advice here. We have various little problems that I think we could solve by doing things "the right way", so some nudges in the right direction would be cool.

We're working with SCCM for deployment, and we're in the process of building various base Win7 images. We use SCCM over PXEboot inside some VMs to do the actual image building. It's getting very time consuming, especially with the element of trial and error. Most of our time is spent waiting for images to finish deploying and fail, then studying all the logs to figure out what we did wrong in the TS.

It looks like everyone swears by using MDT instead of SCCM to actually build the images that SCCM then deploys. Because you can only have one PXE server answering on a given subnet, I want to know how people are setting up their test labs in this kind of situation.

We're using VMs hosted on vSphere/vCenter to manually build the images, which we then capture by mounting the MDT image capture ISO. Then we make MDT task sequences inside SCCM that would be used to deploy the images with various specializations.

We're total amateurs with VMware, networking, and enterprise Windows. I think maybe our best bet is to study virtual networking in VMware and build a Windows Server VM with MDT and DHCP, but I'm on the fence due to the time crunch. Would USB booting do what we want in the short term? More than anything else, I really just want to know how everyone else is doing it.

Sudden Loud Noise
Feb 18, 2007

Demie posted:

I'd appreciate some general advice here. We have various little problems that I think we could solve by doing things "the right way", so some nudges in the right direction would be cool.

We're working with SCCM for deployment, and we're in the process of building various base Win7 images. We use SCCM over PXEboot inside some VMs to do the actual image building. It's getting very time consuming, especially with the element of trial and error. Most of our time is spent waiting for images to finish deploying and fail, then studying all the logs to figure out what we did wrong in the TS.

Why are you creating multiple images? Are the differences between them so drastic that it's better than just having multiple task sequences?

Have you tried a build and capture process based entirely in SCCM? Or is there something in your environment that you need in your image that SCCM doesn't allow for?

Sudden Loud Noise fucked around with this message at 03:50 on Aug 11, 2013

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
I don't even bother with image capture, I just build to bare metal every time. It doesn't really take that long in comparison to just deploying a captured image, as we don't install that many fancy apps, just the basics like Firefox, Java, Flash, Adobe Reader, Office, etc. When Office 2013 SP1 comes out I might make a base image with Win 7 and Office 2013 SP1, because the service pack for Office 2010 took about 20 minutes to apply and that doesn't change very often.

If I had a lot of software packages that took a long time to build I'd cook some images with just the base OS and those packages, as I'm hoping they wouldn't change very often.

Or, just take a look at how many man hours you're spending making images, and then figure out if that cost is worth having a faster deployment. Once you get your task sequence working properly a machine may take longer to build, but it costs you zero time because it's all automated.

Sudden Loud Noise
Feb 18, 2007

You pull all Windows updates (since SP1) on each OSD? I miss having reliable network infrastructure. :smith:

Anyone have any experience with Radia/CAE? I'm trying to give it a fair shot but man it seems really terrible compared to SCCM.

Demie
Apr 2, 2004

spidoman posted:

Why are you creating multiple images? Are the differences between them so drastic that it's better than just having multiple task sequences?

We're dealing with various public access and internal staff PCs, so yes, there are some drastic differences. My goal is to keep base images to a minimum; I think we can get away with three and automate the rest, but the bottom line is I have to get some base images built, and I'm not getting that done very quickly.

spidoman posted:

Have you tried a build and capture process based entirely in SCCM? Or is there something in your environment that you need in your image that SCCM doesn't allow for?

That's what we're doing. It takes forever and crashes with nonsense error codes whenever I try to do something in the task sequence. Considering the steep learning curve I'm dealing with (I break task sequences every time I touch them), I am wondering if this is really the best way to do it. Various MS MVP types claim that MDT is faster and makes more compatible images (not sure how they're more compatible), so I want to know if that's what I "should" be doing. I'm really not sure, so that's why I ask.


FISHMANPET posted:

I don't even bother with image capture, I just build to bare metal every time. It doesn't really take that long in comparison to just deploying a captured image, as we don't install that many fancy apps, just the basics like Firefox, Java, Flash, Adobe Reader, Office, etc. When Office 2013 SP1 comes out I might make a base image with Win 7 and Office 2013 SP1, because the service pack for Office 2010 took about 20 minutes to apply and that doesn't change very often.

When you say bare metal, do you mean you just have the system deploy an untouched installer WIM with further instructions in the TS?

FISHMANPET posted:

If I had a lot of software packages that took a long time to build I'd cook some images with just the base OS and those packages, as I'm hoping they wouldn't change very often.

Or, just take a look at how many man hours you're spending making images, and then figure out if that cost is worth having a faster deployment. Once you get your task sequence working properly a machine may take longer to build, but it costs you zero time because it's all automated.

Understood, but we'll be doing some massive deployments in small windows, so performance is an issue. And getting some weird customizations and 3rd party apps to work just right is what I'm really trying to accomplish in these images.

Italy's Chicken
Feb 25, 2001

cs is for cheaters
Speaking of MDT, has anyone compared the deployment time between a "fully captured" and a "fully task sequenced" setup? I'm trying to speed up deployment as much as possible.

Swink
Apr 18, 2006
Left Side <--- Many Whelps
Anyone tried out Work Folders in 2012R2 preview?

By the looks of it, it allows us to share out a user's folder to their personal laptop\ipad or whatever, then remotely wipe the data when the staff member loses his ipad or we just want to fire his rear end.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Any suggestions on a program that will monitor disk space on a bunch of servers?

We have this unconfigurable report that runs every week and just gives you the numbers for every server. Unsorted, every server every week. I don't care about them unless they are running low, so it just gets ignored.

I want to be able to set something like "only alert me if it's under 2GB or 10%" or something like that. I don't have time to read through a bunch of poo poo every day.

Matt Zerella
Oct 7, 2002

Norris'es are back baby. It's good again. Awoouu (fox Howl)

Bob Morales posted:

Any suggestions on a program that will monitor disk space on a bunch of servers?

We have this unconfigurable report that runs every week and just gives you the numbers for every server. Unsorted, every server every week. I don't care about them unless they are running low, so it just gets ignored.

I want to be able to set something like "only alert me if it's under 2GB or 10%" or something like that. I don't have time to read through a bunch of poo poo every day.

I use cacti with the threshold plugin. Then I create a page with all of the disk space on it so I can see everything at once, and then set thresholds with email monitoring.

It's not super simple to set up, but once it's going it's great (and agentless, just use SNMP and the base Win2k profile).

sanchez
Feb 26, 2003

Bob Morales posted:

Any suggestions on a program that will monitor disk space on a bunch of servers?

We have this unconfigurable report that runs every week and just gives you the numbers for every server. Unsorted, every server every week. I don't care about them unless they are running low, so it just gets ignored.

I want to be able to set something like "only alert me if it's under 2GB or 10%" or something like that. I don't have time to read through a bunch of poo poo every day.

It sounds like you're missing monitoring in general, as any monitoring product will do that for you. Do you get alerts if a service stops or starts throwing errors? Something like Zabbix could work.
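That said, if you just want the "2GB or 10%" rule as a stopgap while you pick a real monitoring product, it's a ten-line script. A sketch — the drive list and the actual alert delivery (email, whatever) are left as placeholders:

```python
# Minimal sketch of the "alert only if under 2 GB or under 10% free" rule.
# Wire the printed alerts into email/your ticketing; paths are placeholders.
import shutil

GB = 1024 ** 3

def low_on_space(total: int, free: int,
                 min_free: int = 2 * GB, min_pct: float = 0.10) -> bool:
    """True if free space is below the absolute floor OR the percentage floor."""
    return free < min_free or (total > 0 and free / total < min_pct)

def check_paths(paths):
    """Yield (path, free_bytes) for each path that trips the threshold."""
    for path in paths:
        usage = shutil.disk_usage(path)
        if low_on_space(usage.total, usage.free):
            yield path, usage.free

if __name__ == "__main__":
    for path, free in check_paths(["/"]):   # e.g. [r"C:\", r"D:\"] on Windows
        print(f"ALERT: {path} has only {free / GB:.1f} GB free")
```

Scheduled daily per server (or pointed at UNC paths from one box), it only emails you when something is actually low, which is the whole complaint with the weekly report.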

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

LmaoTheKid posted:

I use cacti with the threshold plugin. Then I create a page with all of the disk space on it so I can see everything at once, and then set thresholds with email monitoring.

It's not super simple to set up, but once it's going it's great (and agentless, just use SNMP and the base Win2k profile).

I have a cacti setup already going - that's not a bad idea.

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

Italy's Chicken posted:

Speaking of MDT, has anyone compared the deployment time between a "fully captured" and a "fully task sequenced" setup? I'm trying to speed up deployment as much as possible.

I've never used MDT for a 'fully captured' image; we used Ghost for that. In my experience, deploying a 7GB fully captured XP image took less than 15 minutes, plus another 10 for our post-deployment script to run. We could Ghost a workstation and have it back in production in about 30 minutes, with half of that time being our post-deployment script that renamed the machine, joined the domain, and installed our A/V software, rebooting every step of the way.

When I image a laptop using MDT, with Windows 7 SP1 with all current updates and Office 2010 ProPlus with all current updates, Lync, VPN, and MSOL Sign In, it takes about 90 minutes or so. A big chunk of that time is installing the updates during the deployment. I don't mind the time though, as maintaining older monolithic images is very time consuming and something I would only recommend for certain scenarios. Our call center is one place where a monolithic fully set up image works well. We have 450 seats of the exact same model of computer and they all need to be configured the exact same way. Makes sense to rebuild that image when you need to. Our laptop pool though is a mishmash of E6400 to E6430's, and it doesn't make sense to build and maintain 4 different laptop images when it might save you an hour at best per laptop.

Even though MDT can take up to 90 minutes or so to deploy a laptop, it's hands off once you get it going. I just stick it on a shelf next to my desk and let it do its thing.

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams

Demie posted:

When you say bare metal, do you mean you just have the system deploy an untouched installer WIM with further instructions in the TS?

Yep, deploying the install.wim and then doing everything to that. Though SCCM 2012 has made it a lot easier to integrate patches into that install.wim, so what I'm deploying has most patches installed.

With SCCM 2007 I could never get images to work properly, I'd have to manually reinstall the SCCM client on any imaged machine for it to actually communicate with the SCCM server after an install, though I haven't tried it in 2012 yet.

Thanks Ants
May 21, 2004

#essereFerrari


I'm not sure if I've mentioned this before, so apologies if I have; I really need to get around to buying search one day.

I have a user on a laptop, in a Roaming Profiles and Redirected Folders / Offline Files environment. Whenever this user logs on away from the network they get an error about their profile being unavailable, and a message about not being able to display the desktop with the path set to the UNC path of the server location that it usually lives on. What's up with this? I thought it was supposed to gracefully fail over to the Offline Cache? Is this just a rare situation where a flatten and reinstall will clear it up? I've already cleared the Offline Cache and the laptop's been connected to the network more than long enough for the cache to build itself again.

Unfortunately it's a pretty small organisation so there are no other regular laptop users. I can get access to the event logs if necessary but I need to wait for this guy to be back in the office or on the VPN.

Demie
Apr 2, 2004

FISHMANPET posted:

Yep, deploying the install.wim and then doing everything to that. Though SCCM 2012 has made it a lot easier to integrate patches into that install.wim, so what I'm deploying has most patches installed.

With SCCM 2007 I could never get images to work properly, I'd have to manually reinstall the SCCM client on any imaged machine for it to actually communicate with the SCCM server after an install, though I haven't tried it in 2012 yet.

It seems attractive, but I'm not 100% sold. I am currently dealing with a thick image that a consultant built, which seems to have a show-stopping bug, so I'll have to rebuild from scratch; I really wish he'd built it your way. It always installs to drive D and crashes on further captures when you correct it to drive C, but it deploys OK as-is, so that's "mission accomplished" for a consultant. It will be a pain to figure out the configurations that were in it, but it will be nice to take all the stupid browser plugins out, which should all be packages anyway. The image will be thinner, and I intend to document every step this time (yeah I know, famous last words).

When I try to work with an MDT TS from SCCM, I can't help but notice that it has to reload the SCCM client at every pass. I'm switching to SCCM-native TSes for the basic B&C, against the advice of said MVPs, since they have far fewer steps.

Even when you create an MDT TS inside SCCM, that's actually not the same as an MDT TS in MDT, from what I'm reading. Building a dev lab on one box running VMs is looking more attractive; it looks like it's less tedious when it's not constantly reloading the SCCM client at every pass, not to mention the idea of downloading and capturing images directly with local VMs. But I've never even used MDT by itself, so I'm on the fence on whether I should risk deadlines by investing the time up-front to figure out that setup, and I'm not sure if that's really how people are doing it.

Demie fucked around with this message at 05:31 on Aug 13, 2013

Yaos
Feb 22, 2003

She is a cat of significant gravy.
I use a single thick image made in a VM and then install per-user programs after that. Since all our programs are poo poo, most cannot be silently installed, so it is easier to install them after I am done with an internal installer rather than MDT's install thing. I don't bother with changing anything in the task sequence other than turning off the GPO settings that break connecting to non-Windows shares.

wyoak
Feb 14, 2005

a glass case of emotion

Fallen Rib
Has anyone had any luck with the Chrome Legacy Browser plugin and related group policies and templates? We're looking at rolling Chrome out, but we still use a version of MS CRM that requires IE. I've got the Chrome policies installing the plugin correctly, and it's creating the registry keys that SHOULD be telling Chrome to open certain pages in IE, but Chrome doesn't open them in the external browser.

Wasn't sure if this should go in the Chrome thread or the GPO thread but it's kind of a combination of both so I'll put it here instead....

Mr. Clark2
Sep 17, 2003

Rocco sez: Oh man, what a bummer. Woof.

Can anyone point me to a primer on Group Policy/Preferences (is there even a difference?). I figured it's about time I learn dat shizz....
*edit* Online, print, dont care about the format.

Mr. Clark2 fucked around with this message at 17:10 on Aug 13, 2013

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

This guy is generally referred to as an expert in Group Policy.

The latest edition of his book

http://www.amazon.com/Group-Policy-Fundamentals-Security-Managed/dp/1118289404/ref=sr_1_1?s=books&ie=UTF8&qid=1376410458&sr=1-1

His website

http://www.gpanswers.com/

Docjowles
Apr 9, 2009

I've used an earlier edition of that book and it does indeed own. I also like Group Policy Central for random how-to guides once you understand the fundamentals.

Pham Nuwen
Oct 30, 2010



I work in research, typically with Linux or Unix-like OSes. One of our projects needs an Exchange server, and none of us really know how to do Windows server things, so I figured I'd ask here.

We're doing some testing with email servers. We basically need a very simple Exchange setup that will accept incoming messages via SMTP. Then, we'd like to be able to check how many messages have been received, possibly retrieving them all for analysis using something like fetchmail from a remote machine.

What do we need, at a bare minimum? Is this something that's feasible in a VM? What's licensing like? Note that this will never be used for actual email, just for generated test messages on an internal network.

Bob Morales
Aug 18, 2006


Just wear the fucking mask, Bob

I don't care how many people I probably infected with COVID-19 while refusing to wear a mask, my comfort is far more important than the health and safety of everyone around me!

Pham Nuwen posted:

I work in research, typically with Linux or Unix-like OSes. One of our projects needs an Exchange server, and none of us really know how to do Windows server things, so I figured I'd ask here.

We're doing some testing with email servers. We basically need a very simple Exchange setup that will accept incoming messages via SMTP. Then, we'd like to be able to check how many messages have been received, possibly retrieving them all for analysis using something like fetchmail from a remote machine.

What do we need, at a bare minimum? Is this something that's feasible in a VM? What's licensing like? Note that this will never be used for actual email, just for generated test messages on an internal network.

It needs to be Exchange and can't just be iMail or something?

FISHMANPET
Mar 3, 2007

Sweet 'N Sour
Can't
Melt
Steel Beams
Yeah, especially if you're unix guys: set up postfix or sendmail or something and call it a day.

Pham Nuwen
Oct 30, 2010



FISHMANPET posted:

Yeah, especially if you're unix guys: set up postfix or sendmail or something and call it a day.

That's the problem, one of our goals is specifically to test against Exchange. We already have postfix set up, that was easy. Unfortunately, there's a bullet point on the list of deliverables that says "Tested with Exchange".

skipdogg
Nov 29, 2004
Resident SRT-4 Expert

You can get 180-day trials of Server and Exchange, throw them in a VM, and it should do what you need.
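Once the trial box is accepting SMTP, generating the test traffic from the Linux side is easy with Python's smtplib. A sketch — the host and addresses are placeholders for your lab, and the counting/retrieval side (fetchmail or whatever) stays as you planned:

```python
# Sketch: fire N generated test messages at an SMTP listener (e.g. the
# Exchange trial VM). Host and addresses are placeholders for a lab setup.
import smtplib
from email.message import EmailMessage

def make_test_message(i: int, sender: str, rcpt: str) -> EmailMessage:
    """Build one numbered test message so the batch is easy to count later."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = rcpt
    msg["Subject"] = f"test message {i}"
    msg.set_content(f"generated test body {i}")
    return msg

def send_batch(host: str, count: int,
               sender: str = "probe@lab.local", rcpt: str = "inbox@lab.local"):
    """Open one SMTP session and push the whole batch through it."""
    with smtplib.SMTP(host, 25, timeout=10) as smtp:
        for i in range(count):
            smtp.send_message(make_test_message(i, sender, rcpt))

if __name__ == "__main__":
    # Demo only builds a message; call send_batch("your-exchange-vm", 100)
    # once the trial server is actually listening.
    print(make_test_message(0, "probe@lab.local", "inbox@lab.local")["Subject"])
```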

IT Guy
Jan 12, 2010

You people drink like you don't want to live!
I have a slight security issue.

We have a user going away for surgery. I'm not sure how long they'll be gone, but I'm assuming between two weeks and a month. Anyway, they're hiring a temp, and they want the temp to use the old user's account name, email, etc. This doesn't sit well with me, but I don't have much to back up why we should create a separate account for this temp user. In my mind they should be a new user; they shouldn't be logging in with someone else's credentials, even if they are a temp. My help desk is an oval office who likes to take the lazy way out of everything, so they're advocating to just reset the password and give the temp the old user's account. I have the final say, though. Can anyone help me back this up? Are there best practices or standards out there you can point me to that I'm not aware of?

IT Guy fucked around with this message at 21:20 on Aug 13, 2013


skipdogg
Nov 29, 2004
Resident SRT-4 Expert

Temp account with email forwarding is what I would do. If for some reason this person is running some business critical poo poo through their personal email account and not a shared one, maybe delegate rights to the email account.

Standard practice for us when someone is on any kind of leave is to disable all access and accounts. Never ever ever would I let someone impersonate another user like that.
