The two processors seem to be:
http://ark.intel.com/Product.aspx?id=50067
http://ark.intel.com/Product.aspx?id=52219
and the differences are more than just clock speed: the faster one also has higher memory bandwidth, along with plain old 'more features' such as VT-d virtualization for directed I/O, Execute Disable, Quick Sync video, Wireless Display, My WiFi, and 4G WiMAX. What it means for those things to have built-in support in the CPU, I don't know.
I'd predict that the memory bandwidth, which is about 20% higher in the faster chip, would have a more observable effect than the roughly 10% higher clock speed, but unless you have an intensive use planned, the difference won't be worth much worrying about.
Far better to look at an SSD instead of a normal hard disk; that will provide a big shift in how responsive everything feels - the hard disk is the biggest bottleneck in typical computers these days. (Where a hard disk can shift 20-40 MB/s sustained, an SSD can shift 100-200 MB/s sustained. Where a hard disk can handle around 100 I/O operations per second, an SSD can handle many hundreds or a few thousand.)
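If you want to sanity-check sequential throughput yourself, a crude dd test gives a ballpark figure (the file size and path here are arbitrary choices, and a dedicated benchmark tool will be more representative):

```shell
# Write a 256 MiB test file and let dd report the elapsed time / throughput.
# bs=1048576 is 1 MiB, written numerically so the same command works with
# both BSD (macOS) and GNU dd.
dd if=/dev/zero of=/tmp/disk_test bs=1048576 count=256
rm /tmp/disk_test
```

Note that writing zeros only exercises sequential writes; it says nothing about the random-access performance where SSDs pull furthest ahead.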
For which types of applications will the different graphics cards have an observable effect in performance?
- Graphics-heavy games - racing, running around shooting, flying and the like (not card, dice, board, or web/Flash games, etc.).
- Graphics-heavy apps like architectural modelling, 3D scene rendering, and Pixar-style film rendering.
- Currently niche apps which use the graphics card as a spare processor - at the moment this means things like the distributed computing project SETI@home and the PowerDirector 7 video-encoding software. There is a push in the industry to make this more widespread, but it's probably still too far from everyday uses to worry about for another year or three.
My vote is that unless you have a particularly intensive workload or unusual use which you haven't mentioned, the 2.0 GHz model will be fine; if you can spare the money, see if you can find a machine with a good SSD to compare, and consider one as an upgrade (Apple-supplied or aftermarket) for an everyday snappiness boost.
The processor used in the early 2011 15" and 17" i7 models is the 2.2 GHz quad-core (2720QM) Intel Core i7 Sandy Bridge with 6 MB on-chip L3 cache, so they have 4 cores and 8 threads.
Apple's MacBook Pro performance page makes it clear that Hyper-Threading is now standard on all MacBook Pro laptops.
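If you want to confirm the core and thread counts on a given machine yourself, sysctl reports them from Terminal (these particular keys are macOS-specific):

```shell
# Physical cores vs. logical (Hyper-Threaded) processors
sysctl -n hw.physicalcpu   # 4 on the quad-core i7
sysctl -n hw.logicalcpu    # 8 with Hyper-Threading enabled
```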
I work as the Mac admin for a large non-profit. At the moment I oversee roughly 200 or so Macs, both in our main office and deployed across the world. The Mac influx has only been a thing for the past 3-4 years; before that we were largely Windows-based, and as such are deeply entrenched in MS Exchange and Active Directory. Now that Macs make up about 35-40% of the user base, and are growing rapidly, that may change. For now, though, this is the environment. This is how I deal with the questions you posed...
Configuring and deploying new/existing Macs: This is largely handled by DeployStudio. I have scripts and a workflow in place that mostly automate the process of installing all the base software (MS Office, Citrix Receiver, etc.), custom user settings (user template, Dock settings, Finder preferences, etc.), base admin account creation, printer drivers, and so on. This is the workflow I apply to all new Macs that come in the door; it is done without first wiping and imaging the Mac, a process referred to as thin imaging. Existing Macs that need to be redeployed I reimage using an image created from a comparable new Mac with current software (I just pulled 10.9.2 from a MacBook Pro Retina, for example), then apply the same workflow I mentioned before. New Macs take about 5 minutes from unboxing to ready-to-deploy; redeployed Macs take about 30 minutes due to the larger image. All of this is done via NetBoot and can be done anywhere in our 6-story building. Profile Manager is later used to deploy different settings and whatnot.
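As one example of the kind of step such a workflow script runs, here is a sketch of creating the base local admin account with dscl. The account name, UID, and password are placeholders, not our actual values, and in production you would not leave a password sitting in a script in plain text:

```shell
#!/bin/bash
# Sketch: create a local admin account post-imaging (all values are examples)
dscl . -create /Users/localadmin
dscl . -create /Users/localadmin UserShell /bin/bash
dscl . -create /Users/localadmin RealName "Local Admin"
dscl . -create /Users/localadmin UniqueID 450
dscl . -create /Users/localadmin PrimaryGroupID 20
dscl . -create /Users/localadmin NFSHomeDirectory /Users/localadmin
dscl . -passwd /Users/localadmin 'changeme'        # placeholder password
dseditgroup -o edit -a localadmin -t user admin    # grant admin rights
```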
Installing and managing software on Macs that are in active use: This is mostly handled through Apple Remote Desktop (ARD). You can use free tools such as PackageMaker, Iceberg, etc. to create a custom installation package (if needed) and then remotely install it on either a single Mac or groups of Macs using ARD. ARD costs about $80, but in my experience it pays for itself rather quickly. It has lots of management tools built in, allowing you to control Macs over VNC, send scripts, install packages, and so on. I have a second install of ARD on our main Mac server, which I use as a Task Server: I can set up installations on that server, and as Macs come online it pushes the installations out to them, which is great since we have lots of staff coming and going at all times. Another popular tool for this is Munki, which I don't have much experience with but am starting to look at. Most Mac admins tend to swear by it.
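As a sketch of the command-line side of this: pkgbuild (installed with the Xcode command line tools) can roll a payload folder into a flat installer package that ARD's Install Packages task can push out, and the kickstart utility turns on the ARD agent on a client. The identifier, paths, and account name below are placeholders:

```shell
# Build a flat .pkg from a payload directory (paths/identifier are examples)
pkgbuild --root ./payload \
         --identifier org.example.finder-settings \
         --version 1.0 \
         FinderSettings.pkg

# Enable the ARD agent on a client, giving full privileges to one local account
sudo /System/Library/CoreServices/RemoteManagement/ARDAgent.app/Contents/Resources/kickstart \
  -activate -configure -access -on -users localadmin -privs -all -restart -agent
```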
Installing software updates from Apple & third-party developers: I use the built-in Software Update Server (SUS) and Caching Server provided by OS X Server. The SUS lets you serve out and manage (blocking/allowing updates) software updates from your local Mac server. The drawbacks are that it can only serve updates for systems matching its own OS version or older, Macs need to be configured to look for it specifically, and so on. Caching Server is proving to be a better fit: it requires no configuration on the client end, and it caches and serves out Mac App Store, iOS App Store, iBooks, and Apple software updates. Its only downside is that you can't block specific updates. This is another area where a lot of people use Munki. For third-party updates (usually Flash, Java, etc.) I'll create an installer package and then push it out via ARD.
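For reference, the "configured to look for it specifically" part for SUS is a single defaults write per client; the hostname is a placeholder here, and 8088 is the port SUS has traditionally listened on:

```shell
# Point this client at the local Software Update Server instead of Apple's
sudo defaults write /Library/Preferences/com.apple.SoftwareUpdate CatalogURL \
  "http://sus.example.org:8088/index.sucatalog"

# Verify by listing the updates the client now sees
softwareupdate -l
```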
Providing secure login and user-experience management on each Mac: As I mentioned, we use Active Directory (AD) for the majority of our user account control. I also bind all of our Macs to the Open Directory (OD) on our main Mac server, which allows for more Mac-centric controls. Beyond that, each Mac has a local administrator account, which is what we use for connecting via ARD and whatnot. When new users sign in at the login screen, the Mac checks Active Directory, pulls their info, and creates a Mobile Account, which lets them still log in when they're outside our network.
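A minimal sketch of that AD bind with mobile accounts enabled, using dsconfigad; the domain, computer ID, OU, and account name are all placeholders you would substitute for your own:

```shell
# Bind this Mac to Active Directory (all names here are placeholders)
sudo dsconfigad -add ad.example.org \
  -computer mac-0042 \
  -username domainadmin \
  -ou "CN=Computers,DC=ad,DC=example,DC=org"

# Create mobile accounts at login without prompting the user each time
sudo dsconfigad -mobile enable -mobileconfirm disable
```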
Inventory or asset management: Right now we use Spiceworks, free help-desk software that has a lot of this built in. It scans the network on a user-set schedule and adds new Macs/devices as they are found. It's built more for Windows/AD use, but it's OK with Macs. We're currently exploring other options.
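If you ever need to feed an inventory system by hand, the fields it usually wants are easy to scrape per machine with the macOS built-ins system_profiler and sw_vers; a rough sketch:

```shell
# Emit hostname, serial number, and OS version as one CSV inventory row
serial=$(system_profiler SPHardwareDataType | awk -F": " '/Serial Number/ {print $2}')
osver=$(sw_vers -productVersion)
echo "$(hostname),$serial,$osver"
```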
Additional - If I were you I would make use of OS X Server. It has relatively meager system requirements, so you could likely find and use an old iMac or Mac mini for relatively cheap. The Server app itself is $20, and ARD is $80; all in all it's pretty cost-effective to set up a Mac server environment using Apple's tools. Once set up, you can do the majority of what larger software/management tools, like JAMF's Casper and whatnot, offer at much greater cost. It's likely not as smooth, but it will still save you lots of time.
Hope that helps!