I am using PowerShell for some ETL work, reading compressed text files in and splitting them out depending on the first three characters of each line.
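The question above is about PowerShell, but the splitting idea itself is straightforward: keep one output handle per three-character prefix and route each line to it as you read. Here is a minimal sketch in Python for illustration, assuming gzip-compressed input and hypothetical file names (not the poster's actual script):

```python
import gzip
from pathlib import Path

def split_by_prefix(src_path: str, out_dir: str) -> None:
    """Route each line of a gzip-compressed text file to an output file
    named after the line's first three characters."""
    Path(out_dir).mkdir(parents=True, exist_ok=True)
    handles = {}  # prefix -> open file handle, so each output file is opened only once
    try:
        with gzip.open(src_path, "rt", encoding="utf-8") as src:
            for line in src:
                prefix = line[:3]  # assumes every line has at least 3 regular characters
                if prefix not in handles:
                    handles[prefix] = open(Path(out_dir) / f"{prefix}.txt", "w", encoding="utf-8")
                handles[prefix].write(line)
    finally:
        for fh in handles.values():
            fh.close()

# Example call (hypothetical paths):
# split_by_prefix("extract.txt.gz", "split_output")
```

Keeping the handles open for the whole pass avoids reopening files per line, which matters once the row count gets large.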
We have a text file with about 100,000 rows and about 50 columns per row; most of the data is pretty small (5 to 10 characters or numbers).
I am new to OLAP. I understand the table structure and ETL process. I don't understand when data is supposed to be deleted from the fact table.
We unit test most of our business logic, but are stuck on how best to test some of our large service tasks and import/export routines. For example, consider the export of payroll data from one system t…
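One common way to make a large export routine testable is to separate the formatting logic from the I/O and hand it an in-memory stream in the test. A rough sketch of that pattern, assuming a hypothetical export_payroll function and record shape rather than the poster's actual system:

```python
import io
import unittest

def export_payroll(records, out_stream):
    """Hypothetical export routine: writes one delimited line per payroll record."""
    for rec in records:
        out_stream.write(f"{rec['employee_id']},{rec['gross_pay']:.2f}\n")

class ExportPayrollTest(unittest.TestCase):
    def test_export_writes_one_line_per_record(self):
        records = [
            {"employee_id": "E001", "gross_pay": 1234.5},
            {"employee_id": "E002", "gross_pay": 987.0},
        ]
        buf = io.StringIO()          # in-memory sink instead of a real file or system
        export_payroll(records, buf)
        self.assertEqual(buf.getvalue().splitlines(),
                         ["E001,1234.50", "E002,987.00"])

if __name__ == "__main__":
    unittest.main()
```

The same idea works for imports: parse from a small fixture stream and assert on the resulting records, leaving the real source and destination systems out of the unit test.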
I need to load data for my Rails application from multiple providers (REST/SOAP based XML feeds) into the database on a recurring basis. I have written a set of Rake tasks which are kicked off by when…
I'm starting to get involved in quite a bit of ETL work at my current job, and everyone seems to be pretty partial to SSIS. I'm struggling trying to do the most trivial transformations through BI Studio…
I have two tables; records are being continuously inserted into these tables from an outside source. Let's say these tables are keeping statistics of user interactions. When a user is cli…
Folks, we're building an ETL process to load a mid-size dimensional data warehouse using SQL Server 2005 SSIS on a 64-bit OS. We're planning to use SSIS's Checksum package to…
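The checksum approach mentioned above boils down to hashing the tracked (non-key) columns of each incoming dimension row and comparing that hash against the one stored for the existing row, so unchanged rows can be skipped cheaply. A small illustration of the idea in Python, with hypothetical column names; this is not the SSIS Checksum component itself:

```python
import hashlib

def row_checksum(row: dict, columns: list) -> str:
    """Hash the tracked columns so changed rows can be detected without comparing each one."""
    payload = "|".join(str(row.get(c, "")) for c in columns)
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def classify(incoming: dict, existing_checksums: dict, key: str, tracked: list) -> str:
    """Return 'insert', 'update', or 'unchanged' for one incoming dimension row.
    existing_checksums maps business key -> previously stored checksum."""
    checksum = row_checksum(incoming, tracked)
    stored = existing_checksums.get(incoming[key])
    if stored is None:
        return "insert"
    return "unchanged" if stored == checksum else "update"

# Example (hypothetical dimension row):
# classify({"customer_id": 42, "name": "Acme", "city": "Oslo"},
#          existing_checksums={}, key="customer_id", tracked=["name", "city"])
```

Storing the checksum alongside the dimension row means the lookup only compares one value per row instead of every tracked column.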
I am looking for the best solution for custom file parsing for our enterprise import routines. I want to basically change one file format into a standard file format and have one routine that imports…
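A common way to structure this is one parser per source format, each emitting records in the same standard shape, so a single import routine handles everything downstream. A sketch of that pattern, assuming hypothetical source formats and field names:

```python
import csv
import json
from typing import Callable, Dict, Iterator

# Each parser normalizes one vendor-specific file into the same standard record shape,
# so the import routine only ever sees one format. Field names here are made up.
def parse_vendor_csv(path: str) -> Iterator[Dict[str, str]]:
    with open(path, newline="", encoding="utf-8") as fh:
        for row in csv.DictReader(fh):
            yield {"id": row["ID"], "amount": row["Amt"], "date": row["TxnDate"]}

def parse_vendor_json(path: str) -> Iterator[Dict[str, str]]:
    with open(path, encoding="utf-8") as fh:
        for item in json.load(fh):
            yield {"id": item["identifier"], "amount": item["value"], "date": item["posted"]}

PARSERS: Dict[str, Callable[[str], Iterator[Dict[str, str]]]] = {
    "vendor_a": parse_vendor_csv,
    "vendor_b": parse_vendor_json,
}

def import_file(fmt: str, path: str) -> None:
    """The single import routine: consumes standard records regardless of source format."""
    for record in PARSERS[fmt](path):
        print(record)  # stand-in for the real database load
```

Adding a new source then only means writing a new parser and registering it; the import routine itself never changes.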