Deploying what you need to move a job is much like packaging what you need to move application code, assuming you correctly put all the new items and changes in source control. This includes your SSIS package and any object creation or alteration scripts: tables, views, UDFs, stored procs, CLR assemblies (never create database objects through the GUI if you want to deploy them later), and any scripts to populate tables (such as lookup tables). You may need to number the items in the deployment folder to ensure they run in the correct order. I usually write a deployment document as well, because some of what we deploy goes to different servers (our SSIS server is different from our database server).
You can also script out the job, but you will need to review and change the script for the new environment. I often find it just as easy to set up the job on the other server manually (but I have DBA rights to all servers; if you do not, you will probably need to script it).
Before SQL Server 2012 and the project deployment model, the only way to run an SSIS package from the context of SSMS was either a SQL Agent job or xp_cmdshell.
SQL Agent calls dtexec.exe and has a special parameter, /X86,
that is only respected when the package is run from SQL Agent. Otherwise, you'll want to specify the explicit path to your dtexec.exe to control whether it runs as a 32- or 64-bit process.
This option is only used by SQL Server Agent. This option is ignored if you run the dtexec utility at the command prompt.
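Outside of SQL Agent, the bitness is decided purely by which dtexec.exe you invoke. A minimal sketch, assuming a default SQL Server 2016 install (the version folder and the package path are examples; adjust for your environment):

```bat
rem 64-bit execution: dtexec under Program Files
"C:\Program Files\Microsoft SQL Server\130\DTS\Binn\DTExec.exe" /File "C:\SSIS\MyPackage.dtsx"

rem 32-bit execution: dtexec under Program Files (x86)
"C:\Program Files (x86)\Microsoft SQL Server\130\DTS\Binn\DTExec.exe" /File "C:\SSIS\MyPackage.dtsx"
```

If you call dtexec.exe with no path, whichever copy comes first on the PATH wins, which is exactly the ambiguity the explicit path avoids.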
Option A
That could be a reason why they use a batch file to kick off the SSIS packages: they want to ensure the correct dtexec is being used to run the package. Shameless self-promotion on determining whether it's 32 vs 64 bit: http://billfellows.blogspot.com/2015/02/is-my-ssis-package-running-in-32-or-64.html Also https://stackoverflow.com/a/13092260/181965
Option B
Another reason I've seen batch scripts used: they didn't use configuration (file, table, whatever) but wanted different behaviour per environment. Assuming I had a ConnectionString variable in my SSIS package and the design-time value was Dev, but now I'm in production, one way to change that value is to always set the property at run time; roughly:
dtexec.exe /File MyPackage.dtsx /SET "\Package.Variables[User::ConnectionString].Properties[Value]";Prod
If I'm lazy, I could use the same job definition across all my environments but change the contents of the /SET
value as need be. I don't see it being that useful, but people view tasks with varying degrees of difficulty.
The above doesn't have to be just for /SET. They can set up logging or anything else, really. Maybe they want to capture the output to an execution log because they didn't know SSIS supports that natively. There might be /REP EWIP > CurrentRun.txt
in the dev batch file but production only has /REP E
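A sketch of that per-environment pattern (the batch file names and package path are hypothetical):

```bat
rem Dev.bat - verbose reporting (Errors, Warnings, Information, Progress), output captured
dtexec.exe /File "C:\SSIS\MyPackage.dtsx" /REP EWIP > CurrentRun.txt

rem Prod.bat - errors only, no capture
dtexec.exe /File "C:\SSIS\MyPackage.dtsx" /REP E
```

Same package, different runtime behaviour, with the difference living in the batch file rather than in an SSIS configuration.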
Option C
Another reason would be the illusion of security. At a former employer, they used the batch script approach to run SSIS packages. Developers were not granted access to SQL Agent. xp_cmdshell - oh hell no to that. Nor could we be trusted to log onto the pre-production boxes (Load, Stage, UAT), but we still needed to run SSIS packages on demand.
So, after a few weeks of submitting urgent request tickets every time the packages needed to run, they created a pair of stored procedures that provided a gated package execution experience: RunSSISPackage and RunSSISPackage32, which would take the name of the package to be run. For every package we created, they would create a batch script that ran it. Lots of busy work for all parties, but at the time it was the only way they would let us run packages. The procedures used xp_cmdshell and EXECUTE AS to let us run the packages as they would be run - with full sysadmin rights. By the way, when you allow people to deploy SSIS packages without auditing, they could add an Execute SQL Task in there that adds the developer group to the sysadmin role. I think. Maybe. Hypothetically.
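For flavor, a heavily simplified sketch of what such a gated wrapper might look like; the names are invented, and the real permission plumbing (module signing or TRUSTWORTHY so EXECUTE AS can reach xp_cmdshell) is omitted:

```sql
-- Hypothetical gated-execution wrapper, not the actual procedure.
-- EXECUTE AS OWNER runs the body under the (sysadmin) owner's context,
-- so callers never get xp_cmdshell rights directly.
CREATE PROCEDURE dbo.RunSSISPackage
    @PackageName sysname
WITH EXECUTE AS OWNER
AS
BEGIN
    DECLARE @cmd nvarchar(4000) =
        N'dtexec.exe /File "C:\SSIS\' + @PackageName + N'.dtsx"';
    EXEC master.dbo.xp_cmdshell @cmd;
END
```

Note the string concatenation of @PackageName: that alone shows why this is the *illusion* of security rather than the real thing.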
Option D
Not SQL Agent. I've seen other scheduling tools used at clients that weren't SQL Agent. Their support for SSIS specifically has been spotty, but they've all supported batch files, so perhaps you have some enterprise scheduler that isn't compatible with SSIS.
Otherwise, if it's just dtexec.exe /file foo.dtsx ... I got nothing
Best Answer
It would be a tough call between a CLR assembly and using SSIS given that your current migration script is just a script.
CLR Assembly:
SSIS Script Component:
While I haven't tested this, I'll go out on a limb and say the performance of both methods should be similar. If performance is a big concern, set up both scenarios and run a comparison test. In fact, you may want to set up both anyway just for practice, so you can see some of the differences and similarities for yourself, which is way more interesting and fun than reading about it (well, at least I think so).
A third option would be to move the process entirely to SSIS, which is more suited to migration/ETL types of tasks. This may or may not be feasible. You wouldn't necessarily need to retool the whole process, as you can, of course, execute arbitrary scripts.
Finally, another option is to convert the code to T-SQL. Depending on the complexity, though, it may be quite a challenge. Certainly there are manipulation tasks much better suited to using a proper programming language/framework like .NET. Without seeing the code itself, it's impossible to say if it would even be worth an attempt. As I mentioned previously, you would definitely need to develop a comprehensive test suite in this case.
Personally, I would probably create a CLR assembly because I'm strong on the programming side of things and it would be the fastest, least intrusive way to integrate the code. But that's me, and I haven't seen the code, so it may not be the best solution for you and your environment.