Fog Creek Software
Discussion Board




Should programmers admin their workstations?

In the Teaching Laziness thread http://discuss.fogcreek.com/joelonsoftware/default.asp?cmd=show&ixPost=4579&ixReplies=22
I said that developers should not have permission to administer their workstations. “a programmer” replied that he would quit if he did not have control over his machine.
If he worked for me, I would say goodbye and have a nice life. Letting someone change their computer whenever and however they want introduces random variables. I personally don’t need that headache.
Any IT person would tell you that you can't let users change or install anything they want. Are programmers special? Are we better or worse than the general computer user?
Programmers, is control of your workstation worth quitting over? Managers, what is your stand? Has anyone ever actually quit over this?
Please, give concrete examples and stories both good and bad. I'm talking about a situation with more than 3-4 people working together.

Doug Withau
Wednesday, March 13, 2002

At the company where I work, one of the departments wanted to use something like Microsoft SMS to create standard workstation disk images, allowing admins to "push" new apps or upgrade anti-virus software, for example. Needless to say, the programmers in that department were pissed.

Rather than reducing the number of variables, the programmers' computer configurations would be changing beneath them, possibly creating app conflicts. Plus, the IT department would then need to support and maintain the highly evolving and overly complicated build tools and configurations. Each programmer has their own favorite tools and settings, which improve their productivity.

Why do you insist that programmers NOT have control over their computers? I can't think of any reason why that would improve individual programmer productivity.

Banana Fred
Wednesday, March 13, 2002

I really think that, with all the things you have to keep in your head and the tools you need to develop sophisticated applications, there is a real need to install utilities, reference guides, etc.
On my machine, I have WinZip, a sniffer, two Java decompilers (sometimes one will do the job when the other won't), a news reader, a graphical FTP client, a custom telnet client that will do SSH, a graphical utility (to measure, crop, and view images), GNU diff, vim, a utility for global replacement of strings across directories, etc., etc. I wouldn't want to live without most of these gadgets, and not many of the other programmers here use them all.

But when I used to do C++ Windows programming, the mixture was even worse.  So I would say that it is a bad thing to deny programmers administration permissions.

I still can't think of a situation where you'd run into REAL trouble if you let your programmers configure their machines. I can only guess you mean denying them authorization to upgrade their programming tools: that can cause a real mess, but it is not a very likely situation if you have experienced programmers.

Dario Vasconcelos
Wednesday, March 13, 2002

Doug Withau wrote:
"Should programmers admin their workstations?"

You don't trust your programmers to admin their own computers. Yet you do trust them to design your applications and hack your source code? Strange...

I've seen too many situations where programmers had to wait hours for the administrator to show up from lunch to make a trivial change. Or the guy went home early... It's inefficient. If your administrators are available 24x7x365 that's a little less inefficient, but still inefficient.

The psychological part is worse: not many programmers will be happy campers knowing that you don't trust them with something most of them consider trivial. Having fun and feeling in control keeps morale higher. And high morale is what gives you high productivity and quality code.

Sure, tell them that you don't want them messing up their computers. But I would say: any programmer who regularly causes trouble by messing up his computer should not be a programmer at all.

Jan Derk
Wednesday, March 13, 2002

I do believe that in general programmers are much better at computer admin/maintenance than other users.  I would hope so.

Sometimes showing a bit of trust can go a long way in relationships within a team.  I'm assuming everyone on your team is an adult.  Treat them like it.  If they can't handle the responsibility, then it's a problem.

rich
Wednesday, March 13, 2002

Okay, this is an effective troll and I'll take the bait.

I suppose the question is only meaningful if you know whether you're talking about 'programmers' or 'engineers'.  A 'programmer' being a soulless automaton whose job it is to append source lines onto a 15-year-old COBOL file or change fixed constants when business rules change.  Okay, lock his workstation before he hurts himself.

An engineer, on the other hand, has the job of creating art on the canvas of bytes.  Dramatic?  Sure, but accurate.  This person may hurt themselves, but stands to benefit from an open computer.  It's narrow-minded and egotistical to assume that the manager can anticipate every utility and application that would be useful to a given person now and in the future, or even evaluate whether a given program has value.

This kind of thinking goes right along with "you must wear a blue shirt and khaki pants", "no personal effects on the desk", "no more than 15 lines in a function", etc.  This thinking works when you want 100 people to do the work of 10 (big company), not when you want 10 people to do the work of 100 (small company).  It all comes down to cultural precedent.

You're correct to tightly control which libraries/DLLs get linked in, which APIs are called, and which technologies are used.  But we're not laying brick here.  A bucket of mortar and a fancy spatula are going to cut it for some, but not for others.

Bill Carlson
Wednesday, March 13, 2002

Is this a troll? I have never worked in a situation where developers did not have control over their own workstations.

chunks
Wednesday, March 13, 2002

There can be so many variations in system installations that it is hard to respond to this.

In most of my recent work environments I had a Windows NT workstation.  As long as I could install some extra tools that I wanted, such as XEmacs, I didn't have any problem with sys admins working on the basic OS stuff.  With my current PC I have the admin password and can do whatever I want to.

On one assignment we had HP-UX workstations administered by an IT group.  For the most part we had the tools we needed and just let IT worry about admin.  However, the sysadmin group wanted the workstations placed on the desktops.  From their POV it was a good idea because it kept the machines out of the dust on the floor.  The developers protested because it took up precious desk space and was noisy.

Is it worth quitting over?  Maybe in a better job market.  As a developer I prefer to work in an environment that is supportive.  If IT is hostile to users that is probably a symptom of a broader problem with the company culture.  I'd want to be somewhere else.

mackinac
Wednesday, March 13, 2002

The admin where I work and I have a good understanding. It goes like this:

"You do what you like on your PC, but if it breaks, you fix it"

That works fine by me. Part of my job is evaluating new tools. If I had to call the admin over every time I wanted to install something, I would go nuts. I wouldn't be able to do my job properly, and the admin wouldn't be able to do his, because he would always be over at my desk getting hassled about something I wanted to install.

In the end, everyone would be unhappy. And yeah, in a workplace like that, eventually I probably would quit.

It's like Joel's example in one of his articles about an admin hassling him to stop coding and clean up some space on the server: it's a waste of valuable time.

Damian

Damian
Wednesday, March 13, 2002

It depends. My first job was at Nortel writing telecomms code on a mainframe (which was shared with everybody else, naturally ... it was *fast* after 17:30): there I had no super permissions, used their standard tools, glass terminal, no problem at all, very tidy and professional.

Here we now use PCs, and everyone administers their own ... except for the 'test' machines, the version control software (VCS) server, and a few other servers, which are actually administered by someone or some few of us when needed but which aren't just randomly hacked up by any/everyone.

IT doesn't support our machines, they wouldn't even if we wanted them to: they support some standard configuration with email installed on it and not much else, not developer tools.

I will insist on a standard build environment, for anyone who builds non-prototype software (which includes all developers): a specific compiler version and standard libraries, for example. I also insist that anything be buildable on any machine that has the standard build/tools environment installed: what this means is that if you're using *.LIB and *.H and *.TLB and all of that, these must be in the VCS. Special things (device drivers) have explicit "how to install the build environment" instructions. I doubt that I need to relate any war-stories that might illustrate why this policy seems sensible to me; people are reminded "why" occasionally if someone temporarily breaks this rule.
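To make that rule concrete, a pre-build check along these lines could enforce it; this is only a rough sketch, not anon's actual setup, and the compiler version, paths, and file names below are hypothetical:

# Hypothetical pre-build sanity check for the "buildable on any standard machine" rule.
# The expected compiler banner and the list of checked-in files are illustrative only.
import os
import subprocess
import sys

EXPECTED_COMPILER_BANNER = "Version 12.00"      # the team's standard cl.exe version (made up)
REQUIRED_CHECKED_IN_FILES = [                   # third-party bits that must live in the VCS
    "third_party/foo/foo.lib",
    "third_party/foo/foo.h",
    "third_party/bar/bar.tlb",
]

def compiler_ok():
    # cl.exe prints its version banner to stderr when run with no arguments.
    try:
        banner = subprocess.run(["cl"], capture_output=True, text=True).stderr
    except FileNotFoundError:
        return False
    return EXPECTED_COMPILER_BANNER in banner

def checkout_ok(root="."):
    missing = [f for f in REQUIRED_CHECKED_IN_FILES
               if not os.path.exists(os.path.join(root, f))]
    for f in missing:
        print("missing from checkout (should be in the VCS):", f)
    return not missing

if __name__ == "__main__":
    sys.exit(0 if (compiler_ok() and checkout_ok()) else 1)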

Some people are early adopters, and some late; e.g. some still use NT v4 on their workstations, and some use XP; and various Solaris versions. One benefit of this is early (pre-formal) testing of the product; another is just keeping some people happy; people also choose their own source browsers and what have you. I don't know of any disadvantages. Some people (especially but not only testers) use Ghost or VMs to run multiple OSes on one machine.

To save money, we don't use standardized hardware; e.g. we don't throw away all computers whenever we buy one newer machine. Machines are hand-me-downed (everyone tends to have two, a more and a less powerful one, for testing our client/server software). Some but not all machines are expensive (multi-CPU NT, and Solaris), with the most expensive ones being in the test room only and administered by the testers.

We had a "let's stop and look for all of the installed IIS servers in here" exercise when a virus got inside our corporate firewall ... on three occasions. Now our test machines, which must run IIS constantly, are usually on a disconnected LAN, and someone, I think, is in charge of knowing about the remaining IIS servers and keeping them patched.

Oh, people tend to be administrators of their local machine only, not of the whole domain; and on Unix we all know the root password (except for the VCS server, which is administered by corporate people off-site, for which I'm thankful) and use it occasionally when needed (e.g. to install stuff), but we tend to develop and run our software there as normal users.

IT administers some stuff: the corporate firewall and VPNs, the LAN cabling, the phone system, the email (client and server), the VCS (client and server, but not the contents of the VCS), and corporate intranet and internet stuff; but they don't know our product and aren't competent to administer how we develop it.

I can conceive of other environments that might be simpler or stricter (mainframe, simple Web, national security).

We're an autonomous group of 20-60 people in a company of 6000. If OTOH we were smaller (8) or larger (200) then someone might (I don't know) be able to fully administer all machines, on either a part-time or a full-time basis respectively.

> The nice thing about open source projects is that you don't have to accept code from people you don't trust, and you get to select who has CVS access...

I find exactly that to be a nice thing about a closed source project too. It's up to us who we hire (or fire, or find another job for, if we find that we can't work with them), we can do peer reviews of new design and code before "accepting" it ...

anon
Wednesday, March 13, 2002

I don't have root on my machine and frankly I can't think of any reason that I should have it.  If I need to install a tool, I install it locally.

Now, Windows NT is broken in this regard.  Most things cannot be installed "locally" in the sense that they can on a *nix box.  Installation in general is an admin-only operation.

If I was a Windows guy, I would insist on having admin.

As a *nix guy, I don't need it, and it's probably better that I don't have it.
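For anyone less used to the *nix side, "installing locally" usually just means building into a prefix under your home directory and putting it on your PATH; no root required. A rough sketch of that workflow (the tool name and paths are made up):

# Rough sketch: build a hypothetical tool tarball into ~/local without root.
import os
import subprocess

HOME = os.path.expanduser("~")
PREFIX = os.path.join(HOME, "local")                    # per-user install prefix
SRC_DIR = os.path.join(HOME, "src", "sometool-1.0")     # unpacked source tree (hypothetical)

def run(cmd):
    print("+", " ".join(cmd))
    subprocess.check_call(cmd, cwd=SRC_DIR)

# The classic autoconf dance, pointed at the user's own prefix instead of /usr.
run(["./configure", "--prefix=" + PREFIX])
run(["make"])
run(["make", "install"])                                # writes under ~/local, so no root needed

print("Done. Add %s/bin to your PATH, e.g. in ~/.profile:" % PREFIX)
print('  export PATH="%s/bin:$PATH"' % PREFIX)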

Bob

Robert W. Anderson
Wednesday, March 13, 2002

As a man who works for a company that never met a server it didn't like, be it DBMS, app, or web, the solution is pretty simple:

1. TEACH YOUR PROGRAMMERS TO USE GHOST
2. Give them a nice large networked hard drive to store their images

END of Problem

Anon
Wednesday, March 13, 2002

If you asked somebody to dig a hole for you and you wanted it done quickly, would you tie one hand behind their back?

Yes, the hole can still be dug.

This is the same thing with admin rights for developers.

Tony
Wednesday, March 13, 2002

I reread my response, and maybe it seemed confrontational.  Let me clarify.

- Code developed on different kinds of machines can be more stable.  The defects that don't repeat on other machines are the more interesting ones.

- Developers need a messy machine that allows them to test the quality of new libraries and experiment with apps.  In addition, they need a clean machine that is reliable.

- Few things have lower quality than an ambitious new app your team is writing.  A braindead little error may end up deleting your root directory.  So anything else you run is likely to be of higher quality.

- Having developers work on multiple platforms pushes the company toward the most stable technologies, the ones that work well on different machines.  Some companies standardize on Outlook, and then it takes forever to support simpler Unix and web email interfaces.

a. programmer
Wednesday, March 13, 2002

I think programmers should have admin rights (assuming a Win NT environment). One of our guys is a great CGI guy but knows little about Oracle. He learned enough Oracle to interface with it by using Toad. He never would've discovered Toad if he couldn't install it on his computer.

Programmers are constantly looking for tools to help. Programmers are constantly solving *new* problems and therefore constantly looking for new tools.

my 2cents.

Mark W
Thursday, March 14, 2002

The "official" rule at my last project was, no loading of any unauthorized software on any company computers. But this interpreted as a way to prevent having one of the less-than-brilliant developers repeatedly screw up their desktops via unsupported installations, and wind up costing the company in support dollars to reopair. I maintained a full suite of "unauthorized" applications on my workstation and was far more productive than the handful of coworkers who actually followed this rule and somtimes spent over a day waiting for desktop support to make their way over.

My interpretation of the rule: if you are fired for installing unauthorized apps, you probably needed to be fired anyway.

Boomhauer
Thursday, March 14, 2002

You know, if you want your developers to produce code, you have to remove everything that prevents that. If developers want to be in control of their workstations, give it to them; if not, supply an IT person to manage them.
If your goal is not to produce code, you can do whatever you want.

Roman Eremin
Thursday, March 14, 2002

If the job market were very buoyant and a 'do not touch' rule was applied to my PC, yes, I'd seriously consider quitting. That sort of attitude would imply a lot of other crap was in place.

There are any number of day to day problems that are far far easier for me to deal with when I have admin rights on this machine, and can install the software I want when I want rather than waiting weeks for IT support to get off their arses and deal with it.

If I found out such a policy was in place at an interview, the interview would end pretty quickly! Similarly, I once interviewed with a company that stated they didn't allow staff to 'use the internet' except after work hours. Any company that so misunderstands developers that it denies them access to their number one resource (Google Groups!) is clearly not worth working for.

DB
Thursday, March 14, 2002

Yeah, there's nothing like saying "Hey, I could solve this problem in 30 seconds if I could install <blah>" and then waiting 2 hours for tech support to arrive, only to find out that they're not authorised to install <blah> because <blah> is still making its long trek through an "approval committee" of people who use computers for email and writing memos and nothing else.

Because the same sort of company that usually has the "no admin for developers" policy usually has "no perl because it was written by hippies and communists" policies as well.

Katie Lucas
Thursday, March 14, 2002

At my former employer, we had a "no admin for developers" policy done right. That was at a small company where we were some 30 developers. When I came to work there I got a machine on which I installed whatever I needed, stuff you would never find on "corporate installed machines".

This install was then ghosted and stored on a CD on a per-person basis, so if my machine got flaky, some tech-support guy ordered a new machine and ghosted "my" machine config back onto the new one. This is very time-saving.

At my current place it is done wrong, as per the post above.

So it is really a matter of how it is done: whether it saves time or is just an obstacle to getting the job done.

Patrik
Thursday, March 14, 2002

Boomhauer wrote:
"The "official" rule at my last project was, no loading of any unauthorized software on any company computers. But this interpreted as a way to prevent having one of the less-than-brilliant developers repeatedly screw up their desktops via unsupported installations, and wind up costing the company in support dollars to repair."


Traditional case of management incompetence: punish/restrict everyone because one person can't handle himself.

I have often asked myself why they do this. My conclusion was that bad managers follow the path of least resistance. They find it scary to confront/fire incompetent employees. It's much easier to fire off a memo with restrictions for everyone.

Jan Derk
Thursday, March 14, 2002


Check out Joel's new book list ... the book on Despair, Desolation, and Depression.  Hmm ... people get depressed when they feel they don't have control of their environment ... hmm.

'nuff said.

MatteyBoy
Thursday, March 14, 2002

Great responses, thanks.
I was trolling a little. I wrote the questions to try and get responses. We do not completely lock down workstations where I work. From the responses, I think we restrict things more than in most development shops. Let me respond and give some background reasons for why I think we are doing things right.
First, I did not think the response to my original post on the Laziness thread was confrontational. I thought this must be a hot button for people if someone is willing to quit, or even threaten to.
People have said that they know what tools they need better than management does. Probably true. If your managers don’t have a good idea of what tools are required, maybe you should suggest hiring some technical managers.
Everyone here has all the tools, editors, and gadgets they need. It is rare to have a request for some new tool refused. The tool must be justified. If it is just cool and advertised as the next best thing, I don’t want people installing it without some controls. If we believe a tool is useful, we test it on one computer. If it is worthwhile, we push it to everyone. The idea is to avoid having things passed around that are just the flavor of the month. This helps avoid mockingbird development (Laziness thread).
It is also very nice to know who has changed something on their machine. When Joe’s code won’t build on Bob’s machine, is it a real bug, or has Bob installed thingy X?
We do not have cubicle police patrolling. We do embedded development with about 25 engineers. This is a very nice, laid-back environment. There are times you have to wait to get something fixed or installed. I bitch about it too. That is what programmers do.
We also have a non-developer dedicated to keeping the PCs running, doing backups, and general system maintenance. If your main business is not software, I do not think your IT department will be able to handle developers’ special needs.
In the original post, Bill Carlson asked what he should do when starting a new, large project. I said take control of things that affect your project wherever you can. This is something that can be controlled, so do it. If it means I never have to waste a day while an engineer reinstalls the OS, I’m going to do it.

Doug Withau
Thursday, March 14, 2002

Jan Derk wrote:

"Traditional case of management incompetence. Punish/restrict all, because one can't handle himself.

I often asked myself why they do this. My conclusion was that bad managers follow the way of the least resistance. They find it scary to confront/fire incompetent employees. It's much easier to fire off a memo with restrictions for everyone. "

I consider it much more likely that it's a CYA move to forestall litigation ("They're _discriminating_ against me!").

In the US these days, you have to maintain such a long and detailed paper trail in order to defend against wrongful-termination lawsuits that I don't think it's a matter of incompetence (although it could be). If they don't already have a significant paper trail (specific incidents, times, dates, other people involved, etc.) on that individual, or if they don't consider the staff resentment caused by the policy to outweigh the costs of establishing such a paper trail, then a blanket policy is not only the path of least resistance, but also the sensible thing to do.

Steve Wheeler
Thursday, March 14, 2002

Steve Wheeler wrote:
"I consider it much more likely that it's a CYA move to forestall litigation."

You've got a valid point there. Drastic measures like firing a person should always be based on facts. And it's good that employees are protected against being fired just because the manager had an argument with his wife. However, restricting every developer because one individual can't handle the responsibility is IMHO incompetence.

In this specific example I would probably lock up the computer of that individual, not the others'. That might get the message through. If it doesn't, I'd fire him. And yes, having some proof in black and white is probably a good thing. Letting fear of paperwork control your actions is a recipe for disaster.

Steve McConnell said it well in Code Complete:
"If you tolerate even one developer whom the other developers think is a problem, you'll hurt the morale of the good developers." and "Some managers say that they have never regretted firing anyone. They've only regretted not doing it sooner."

Jan Derk
Thursday, March 14, 2002

I wrote:
"Steve McConnell said it well in Code Complete:"

Make that
"Steve McConnell said it well in Rapid Development:"

I love the [cough] edit functionality in this forum.

Jan Derk
Thursday, March 14, 2002

In general, I have found it works very well to have developers be local admins on their boxes (w2k environment currently), and to have the senior-most developer on a project (current company calls this person a "TA" for "Technical Architect") be the admin on any project-specific boxes, such as the db server, web server, etc. On some projects, I've been admin on the QA box, but when I'm not, I haven't had any problems going to the TA to get done what I needed to.

Most of our developers are running w2k server on their local boxes.

Oh - BTW, just to clarify - currently working in a project/services environment, instead of a product/shrinkwrap environment. So, each project normally gets its own dev and qa environments (at least) set up, then torn down and repurposed for the next effort.

Anyway, much like several previous posters mentioned, our sysadmin has several types of ghost images for our boxes, and if anyone f**ks-up their box too badly, it's just a few minutes' work to suck the ghost image down and they have a fresh, clean box. Sometimes, being "winders" boxes, we'll just ask the SA to lay down a new image for us anyway just to have a clean box.

Most folks who want them here have at least two machines, with the second one being a *nix box. Nobody's said anything "officially" but it's kind of like "if you can't handle being root for yourself, then you shouldn't have it", so folks handle their own *nix boxes themselves. This is sort of informally enforced by the developers themselves, which is kind of nice. Personally (and politically, but that's grist for another discussion group), I believe in self-regulation as much as possible. Standardize where it's truly important (*operationally important* like compilers, libraries, version control tools and defect tracking tools, not "religiously" important, such as which code editor to use).

Our SA's a great guy -- one of the very best I've ever worked with (and NO, you cannot have him!! ;-). He's got a set of images, like I said - one for our PM, one for our sales folks, etc. I built one QA box the way I wanted, he ghosted it for me (he maintains the images for us), it's easy to re-image any of my boxes in the lab. This makes sure that I have our *important* standardized elements in place -- antivirus and network config. I'm local admin on my personal box, I'm root on my linux box in the lab, and I'm local admin on all the QA boxes. The SA is the domain admin, and there's a backup among our most senior tech folks who can get into everything in the server room if need be.

Folks can pretty much install what they want, and within reason, the SA will help out where he can, but most folks handle what they need to without problems. This is not a strict shop in terms of such rules and procedures, but for this group of people at our current size, it's working very well. All we really have to watch out for is to make sure nobody screws up on any licensing issues. So far, we've stayed on the sunny side of that problem.

We've got about a dozen developers of one flavor or another, just to give you an idea of the size for which this system is working.

My testing environments are all done using VMWare, and without this turning into a commercial for VMWare, I cannot recommend it highly enough. It paid for itself the first project I had to reconfigure the lab for, and continues to do so each time. The nice thing is that it leaves the underlying box untouched (well very nearly - just looks like another app and a *big* data file to w2k). Of course, I admin all the virtual machines. I pretty much have to as the QA Manager. (BTW - feel free to email me if you have any questions about VMWare - I really am an advocate).

It was very different at a previous job. We basically did shrinkwrap software, going out to global customers. I was the Director of Engineering Services and owned CM, QA, Lab Services, Documentation, Beta Testing and Deployment Engineering. The Director of Engineering owned the 60 or so developers. I had about 35 people. That environment required a much more carefully regulated setup, of course. I believe our NT developers were local admins; don't know about our AIX developers, though the skill level and responsiveness of the "MIS" department that actually owned all the company boxes (except for those in my labs) was fairly poor, so I suspect that our AIX developers were also root on their boxes, else nothing would have ever gotten coded. Prime example - MIS "owned" the box on which our VSS repository lived; they administered the accounts, permissions, etc. That all sat hidden away in the big, "secret" MIS department. One day, our entire friggin' staff of 60 developers was basically idled because nobody could check any code in or out - MIS had let the VSS disk completely fill up without having taken any action to head it off. We couldn't see the disks themselves, so we couldn't know. Hell, we didn't even know how big a disk it was. So with 60 developers sitting around reading books or playing with local copies of stuff, they were busy ripping apart a RAID set to get another disk online. Un-effing-believable.

The Director of Engineering and I both started our jobs there at about the same time, so we did a lot of house-cleaning. We were working very hard to try to get everyone onto a standard set of libraries, and we did discover that our developers were using three or four different versions of MS VC++, which we also tried to correct. This was something we considered to be an operationally important standard to enforce.

Fortunately, my CM Manager had his act together and had done a good job of setting up pristine build environments for everything, so at least we knew what the hell everything was getting built with. To support this, we required that everything that went to QA got rebuilt by CM in the build environment first, specifically to flush out refs to people's home-brew versions of libraries located only on their boxes and other random noise.
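That "rebuild it clean before it goes to QA" step lends itself to a simple automated check; here is a minimal sketch of one way to flag libraries referenced from outside the pristine tree. The build command, paths, and log pattern are hypothetical, not the actual CM setup described above:

# Minimal sketch: run the build in a pristine checkout and flag any .lib paths
# referenced from outside it. The build command and log format are hypothetical.
import os
import re
import subprocess
import sys

PRISTINE_ROOT = os.path.abspath("build_area/checkout")   # fresh export from the VCS
BUILD_CMD = ["nmake", "/f", "makefile.win"]               # stand-in for the real CM build

def build_and_capture():
    proc = subprocess.run(BUILD_CMD, cwd=PRISTINE_ROOT,
                          capture_output=True, text=True)
    return proc.returncode, proc.stdout + proc.stderr

def outside_refs(log):
    # Very crude: pull out anything that looks like a Windows .lib path and
    # check whether it lives under the pristine tree.
    libs = re.findall(r"[A-Za-z]:\\\S+\.lib", log, flags=re.IGNORECASE)
    return sorted({p for p in libs
                   if not os.path.abspath(p).lower().startswith(PRISTINE_ROOT.lower())})

if __name__ == "__main__":
    rc, log = build_and_capture()
    strays = outside_refs(log)
    for p in strays:
        print("library referenced from outside the pristine tree:", p)
    sys.exit(1 if rc or strays else 0)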

My bottom line for a larger, more complicated, and more formal environment like that one is that there was still a pretty good amount of autonomy for developers locally, but we filtered out the bad effects of that with our CM process.

There were a *lot* of problems in that organization, and there was a lot more work to do there, but this piece of the process was working pretty well when I left - we could reliably build our code, track our branches, and burn the gold disks repeatedly (for anyone who might not know the term, in our case the "gold disks" had to go to the folks who did the assembly of our hardware to be burned to all the drives before it went out, so these had to be clean copies of good builds).

When I've first arrived in some smaller shops, or when I've been a consultant and first showed up at the client site, I've often not started with local admin rights, but quickly schmoozed the SA into giving them to me. In general, my experience has been that with big "MIS" departments or whatever they might be called, they have simply not been responsive enough to effectively support the development process -- either for the developers or for QA. Guess part of that may have come from the seemingly inevitable turf battles that arose between the IT-support staff and the developers. Don't know - your mileage may vary.

F.J. Weiland
Thursday, March 14, 2002

I'm just posting to vent.

Our PCs were recently moved from one set of cubicles to another... about 50 feet.

I turn on my computer in the morning and nothing. I get the Windows NT startup screen and it won't go any farther than that.

The tech worked for literally 3 days to fix it. It turned out that they plugged the Zip Drive into the wrong port and that caused some sort of hardware conflict.

This was around 2 weeks ago. Today I wanted to burn a CD with the latest version of CityDesk, and guess what. The computer isn't recognizing my cd-writer, and I don't have the rights to go in and see if there are any conflicts or change the drive letter. I went and spoke to another of the tech support guys, but the guy that worked on my computer was out, so I have to wait for someone to 'get around to' fixing it.

I think you have to take into consideration the technical abilities of the person working on the machine day to day and the administrator. I would've figured out the zip drive problem within an hour. "They just moved my computer, what could be wrong? Perhaps they plugged something in in the wrong place."

This CD-ROM thing I could probably fix just as easily if I had the rights to do so.

Mark W
Friday, March 15, 2002

Well I've seen a lot of posts from programmers on this subject, how about some thoughts from a network admin on the subject?

The idea that programmers know how to admin their own machines better than most people do is a fallacy. Some do, some do not.  If someone actually needs to hear my war stories to believe that, let me know.

But making programmers dance to my schedule would be unproductive. You all know that, and if you didn't before, you must by now, having read enough of this topic to get to this post!

So what does work for me? Well, a standard "base" install image that I can return a broken machine to if it's completely hosed, and a stipulation that other than that, it's a case of "u breakum, u fixum".

If you wanted to take a second base install with your own programming tools and geegaws installed, that would be cool with me too; I'd help with that. I know I couldn't run my network without my load of "non-standard" admin tools and geegaws, and I think it's unreasonable to give the programmers less leeway than the admins.

robert moir
Friday, March 15, 2002

I'm not saying programmers know how to admin their own machines, but if your programmers do...

Our company is huge. Large enough for one branch to be able to charge another branch for service. Basically the policy is this:

If you don't have admin rights and your machine breaks and you require a complete rebuild, it's free.

If you have admin rights and you need a rebuild, it'll cost you $2,000.

My case was different - they obviously 'broke' my machine during the move, so I didn't have to pay. The leeway to evaluate is a good thing.

Mark W
Friday, March 15, 2002

If my sysadmin knew how to make the mouse cursor on X ten times bigger, I'd gladly let him administer my machine.

If he could undo what I did to my machine without reinstalling everything, I'd let him administer my machine.

If my sysadmin could just set the anti-virus NOT to scan a zip file when I rename it, I'd let him administer my machine.

If he didn't run to me once a week to ask me how to do this and that, I'd let him administer my machine.

Vardhan Varma
Wednesday, March 20, 2002


MY computer, MY world. I'm not going to tell any manager what he can put over his desk.

Leonardo Herrera
Thursday, March 21, 2002

It's not your computer, Leonardo, it's your employer's.

Personally, I don't think either blindly letting some people admin their machines or blindly stopping them from doing so is a good idea. It should be obvious that some people need greater access to their machine configuration than others, and it should be obvious that some standards have to be set and maintained regardless of who has access to what.

robert moir
Thursday, March 21, 2002
