Is there a way to load a Gzipped file from Amazon S3 into Pentaho Data Integration (Spoon)? There is a "Text File Input" step that has a Compression attribute that supports Gzip, but this module can't
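One common workaround is to pre-process outside of PDI: download the object and decompress it, then point Text File Input at the plain-text result. A minimal Python sketch, assuming `boto3` is available and using placeholder bucket/key names (any S3 client would do; a Job could call this from a Shell step):

```python
import gzip
import shutil


def gunzip_file(src_path, dst_path):
    """Decompress a local .gz file so Text File Input can read it as plain text."""
    with gzip.open(src_path, "rb") as src, open(dst_path, "wb") as dst:
        shutil.copyfileobj(src, dst)


def fetch_and_gunzip(bucket, key, local_gz, local_out):
    """Download a gzipped object from S3, then decompress it locally.
    bucket/key are hypothetical placeholders; boto3 is an assumption."""
    import boto3  # assumption: boto3 installed and AWS credentials configured

    boto3.client("s3").download_file(bucket, key, local_gz)
    gunzip_file(local_gz, local_out)
```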
I have a Pentaho Kettle (PDI) transformation that starts with a Table Output step executing basic SQL. This step hops to an XML Output step that creates an XML file.
I have been attempting to split the results of a Pentaho transformation into multiple files based on the value of a specific field, without any luck.
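The grouping logic itself is simple; here is a sketch in Python of what "one output file per field value" means, assuming dict-shaped rows and a hypothetical `out_<value>.csv` naming scheme (inside PDI the same idea is usually expressed with a step that derives the filename from the field):

```python
import csv
from collections import defaultdict


def split_by_field(rows, field):
    """Group rows (dicts) by the value of `field` and write one CSV per value.
    Returns a mapping of value -> filename. The out_<value>.csv naming is an
    assumption for illustration."""
    groups = defaultdict(list)
    for row in rows:
        groups[row[field]].append(row)
    written = {}
    for value, group in groups.items():
        fname = f"out_{value}.csv"
        with open(fname, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=list(group[0].keys()))
            writer.writeheader()
            writer.writerows(group)
        written[value] = fname
    return written
```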
I’m new to Pentaho, and I need to create a transformation that reads input from Paradox tables. We’re using a really old version of Paradox – version 4.5. The tables that I need to load have .db exte
I am new to Pentaho Kettle and I am wondering what the Internal.Job.Filename.Directory is?
I need to load data from a CSV into a database. Those tables are also used by application code, and the table primary keys are generated by Hibernate's uid generator.
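A sketch of the loading pattern in Python, using SQLite and `uuid4().hex` as stand-ins: the key point is generating a fresh key per inserted row. Note `uuid4().hex` only mimics the shape of Hibernate-style hex identifiers; if the application depends on Hibernate's exact generator, keys should come from Hibernate itself. Table and column names here are hypothetical:

```python
import csv
import sqlite3
import uuid


def load_csv(conn, csv_path, table):
    """Insert CSV rows into `table`, generating a new primary key per row.
    uuid4().hex approximates (but is not) Hibernate's uuid generator."""
    with open(csv_path, newline="") as f:
        reader = csv.DictReader(f)
        for row in reader:
            cols = ["id"] + list(row.keys())
            vals = [uuid.uuid4().hex] + list(row.values())
            placeholders = ",".join("?" for _ in cols)
            conn.execute(
                f"INSERT INTO {table} ({','.join(cols)}) VALUES ({placeholders})",
                vals,
            )
    conn.commit()
```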
I am trying to figure out how to create a job/transformation to uncompress and load a .tar.gz file. Does anyone have any advice for getting this to work? You want to read a text file th
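The unpacking step can be done with the standard library before the transformation reads the files; a minimal sketch (a Job could invoke this, or an equivalent shell command, ahead of the load):

```python
import tarfile


def extract_targz(archive_path, dest_dir):
    """Unpack a .tar.gz archive into dest_dir so downstream steps
    (e.g. Text File Input) can read the plain files.
    Returns the list of extracted member names."""
    with tarfile.open(archive_path, "r:gz") as tar:
        members = tar.getnames()
        tar.extractall(dest_dir)
    return members
```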
Hi all, I'm using Kettle 4.0.1 community version. I am comfortable with Spoon, but for running jobs I need to use Pan and Carte. My problem is that, other than spoon.bat, neither pan.bat nor
I have two columns, named amount and work, with data in the format given below: amount 5822084151 659992378544 52656
Can someone suggest the best way to overcome this situation? I am using Kettle 4.1.0 community version. When I want to preview the data in Spoon for the transformation, the table output