CrazyBird

Everything posted by CrazyBird

  1. Hello! I wonder if it is possible to make multiple inserts into an Oracle database within one ETL job? I am not able to use tOracleConnection with tOracleCommit/tOracleRollback, because I need the "commit every" option, which is not available when using these components. I already tried multiple inserts, but this ended with the inserts blocking each other, so the job gets stuck at the point where it tries to insert data into the second database table. I would really appreciate some help. Thank you in advance. Stefan
  2. Well, I now know the cause of this problem: every tOracleOutput component - if not used with tOracleConnection and tOracleCommit/tOracleRollback - opens a new database connection and with that also a new transaction. In my case that means the first tOracleOutput opens a connection to insert a row into a table but does not close the transaction; then the second tOracleOutput component tries to insert another row into another table, but this transaction is blocked by the first one. So this way is also not the right one for me. What I would need is ONE database transaction over 5 tOracleOutput statements, and that for each row from a tFileInputDelimited component. Are there ways to do so? Thanks in advance, Stefan
  3. I'm processing a CSV file and I want to create database entries in different tables from it. Per CSV line there should be exactly ONE database transaction. The problem is that I tried to use tOracleOutput components connected to the tMap component, and now the transformation never stops (no error, no exit of the program). I tried to debug the code and saw that the job "hangs" when trying to execute THIS statement: insertedCount_tOracleOutput_2 = insertedCount_tOracleOutput_2 + pstmt_tOracleOutput_2.executeUpdate(); which is part of the second tOracleOutput component - see attached graph. If I use tOracleConnection and tOracleOutput, the INSERT works, but then the commit would take place after the last line of the CSV file has been processed, and no atomic insertion is made. Any ideas? Thank you!
  5. Hello! I have a problem understanding the database transaction management in JasperETL. What I need is a transaction for every record read from a CSV file. Until now I used tOracleConnection and tOracleCommit/Rollback, but I just read that this results in one commit (or rollback) covering all changes made in the whole job. This would be bad if I have thousands of lines to process. Which is "the better way"? The problem is: if I uncheck "Use an existing connection" within the Oracle DB components (tOracleInput, tOracleOutput etc.), I have to enter all connection data in EVERY component. Do you have any suggestions for me? Thank you in advance!
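The per-row transaction asked about in the posts above can be sketched in plain JDBC: one shared connection with auto-commit off, and a commit after each input row's inserts. This is only an illustration - the connection URL, credentials, table names and the readCsv() helper are all made up, and inside JasperETL this logic would typically live in custom or tJavaRow code rather than in the generated components:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;

public class PerRowTransaction {
    public static void main(String[] args) throws SQLException {
        // Placeholder connection data - ONE shared connection for all inserts,
        // so the statements cannot block each other.
        try (Connection con = DriverManager.getConnection(
                "jdbc:oracle:thin:@//dbhost:1521/ORCL", "user", "secret")) {
            con.setAutoCommit(false); // commit manually, once per input row

            try (PreparedStatement insA = con.prepareStatement(
                         "INSERT INTO table_a (id, txt) VALUES (?, ?)");
                 PreparedStatement insB = con.prepareStatement(
                         "INSERT INTO table_b (id, info) VALUES (?, ?)")) {
                for (String[] row : readCsv()) { // stands in for tFileInputDelimited
                    try {
                        insA.setLong(1, Long.parseLong(row[0]));
                        insA.setString(2, row[1]);
                        insA.executeUpdate();

                        insB.setLong(1, Long.parseLong(row[0]));
                        insB.setString(2, row[2]);
                        insB.executeUpdate();

                        con.commit();   // atomic unit = all inserts for this row
                    } catch (SQLException e) {
                        con.rollback(); // undo only this row's inserts
                    }
                }
            }
        }
    }

    // Hypothetical CSV reader; in the real job the rows come from the ETL flow.
    private static List<String[]> readCsv() {
        return List.of(new String[] { "1", "some text", "some info" });
    }
}
```

The same pattern extends to five statements per row; the essential points are the single Connection and setAutoCommit(false).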
  6. Hello! I've a new challenge using JasperETL. I've got a CSV file coming from an old host system. Sometimes records are continued on the next line, which is indicated by a number greater than one. Example:
     id;continued;text
     1001;1;"some text";
     1002;1;"this text...";
     1002;2;"...is continued here";
     In this example the second field is the flag which indicates whether the current record belongs to its predecessor: for the id "1002" there exists a continuation record, which is indicated by the "2" in the field "continued" (last line). So what I want is to process the two lines for the id "1002" as ONE data row. Is there any easy possibility to do this? Thank you for every suggestion in advance! Stefan
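One generic way to handle this, independent of any particular component, is a small pre-processing step that appends a line's text to the previous record whenever its "continued" counter is greater than 1. A sketch under the column layout shown above (id, continued, text); the class and method names are mine:

```java
import java.util.ArrayList;
import java.util.List;

public class ContinuationMerger {
    /**
     * Merges records whose "continued" counter is > 1 into the preceding
     * record with the same id. Each input row is {id, continued, text};
     * each output row is {id, text}.
     */
    public static List<String[]> merge(List<String[]> rows) {
        List<String[]> out = new ArrayList<>();
        for (String[] row : rows) {
            int continued = Integer.parseInt(row[1]);
            if (continued > 1 && !out.isEmpty()
                    && out.get(out.size() - 1)[0].equals(row[0])) {
                // Continuation line: append its text to the previous record.
                String[] prev = out.get(out.size() - 1);
                prev[1] = prev[1] + row[2];
            } else {
                // First (or only) line of a record.
                out.add(new String[] { row[0], row[2] });
            }
        }
        return out;
    }
}
```

In a job this could run as a first pass that writes a cleaned temporary file, which the normal flow then reads.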
  8. Is it possible that the SchemaComplianceCheck has a bug? I work with a positional file as input, and according to the documentation the component should check for null or empty, so I thought that the string "" should result in a rejected line when the "nullable" checkbox is not set. But the generated Java code says if (checkSchema.svnr == null) { }, so empty strings are allowed. Am I right, and if I am, what would a workaround look like? Thanks in advance!
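Assuming the generated null check really does let "" through, one workaround is an explicit Java condition (e.g. in a tJavaRow or a filter expression) that treats both null and empty strings as missing; svnr is the field name from the post, the helper class is mine:

```java
public class NullOrEmptyCheck {
    // Treats null, "" and whitespace-only values as missing, which is
    // stricter than the generated `checkSchema.svnr == null` test.
    public static boolean isMissing(String value) {
        return value == null || value.trim().isEmpty();
    }
}
```

Rows for which isMissing(row.svnr) is true would then be routed to the reject flow by hand.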
  9. Hello! I have the following situation: I have a positional file and a database table and join them together (the file is the main flow, the DB table is the lookup) using the inner join option. It's clear that in this constellation only those records arrive in the output for which data exists in both the DB table AND the file. But now, for logging purposes, I need to know which records from the positional file were skipped because they could not be joined (since this is a data error in the input file). Is there any possibility to do this? Thanks in advance for your answers, Stefan Mackovik
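In Talend-based versions of JasperETL, tMap output tables have a "Catch lookup inner join reject" setting that routes exactly these unmatched rows to a second output (worth checking whether the installed version exposes it). If it is not available, the split can be done by hand; a rough sketch (all names are mine) that partitions main-flow rows by whether their key exists in the lookup:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.Set;

public class JoinWithRejects {
    /**
     * Splits the main flow into joinable rows ("matched") and rows whose
     * key is absent from the lookup ("rejected"), so the rejects can be
     * written to a log instead of being silently dropped.
     */
    public static Map<String, List<String[]>> split(List<String[]> main,
                                                    Set<String> lookupKeys) {
        List<String[]> matched = new ArrayList<>();
        List<String[]> rejected = new ArrayList<>(); // data errors to log
        for (String[] row : main) {
            (lookupKeys.contains(row[0]) ? matched : rejected).add(row);
        }
        return Map.of("matched", matched, "rejected", rejected);
    }
}
```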
  11. Hello! I wonder how to use the tLibraryLoad component? If I'm right, it's for using external libraries (e.g. .jar files) within my JasperETL projects. But the first problem is: how do I integrate the component into my job process? It works when it is the first element of the job and I connect it to the next (input) component with the trigger "OnSubjobOK", but is this right? Second, is there a way to import multiple jar files within ONE tLibraryLoad component, or do I have to cascade tLibraryLoad components for each library? The third problem is: when I try to instantiate and use an imported class in a tJavaRow component, it inserts code fragments like Code:" String, Object>(); public void myFunction(){ if( " into the job script (I saw this in the code view), and as the code view is read-only I'm not able to delete this code (where does it come from anyway?). Also there is no import statement generated in the head of the Java class, so I wouldn't be able to use the class at all. Can anyone tell me please how to make tLibraryLoad work? Thank you in advance, Stefan
  12. We are replacing an old host application with a new JEE application. To get data from other systems into the application there are existing text files, which are imported into our application. These interfaces should not be changed. There is one file per interface, and within one and the same file there are different kinds of plain-text datasets: every kind of dataset is defined by the first 4 characters of the line. For example, "CUST" tells us that the line is the definition of the customer which should be updated. This dataset is followed by one or more records with "ADDR", which indicates that the line contains address information for the customer. Then the next customer follows in another "CUST" line. So now let's imagine that the customers exist in a database and that I have to create or update the address entries for these customers. How do I do this with JasperETL? The problem is that there are no fixed positions which are valid for the whole file, only for a dataset type, so I'm not able to use the tFileInputPositional component available in JasperETL. Example:
      CUSTJohn Doe
      ADDR12345London Street 4
      Thanks for your help! Stefan
  13. Because maybe the explanation of my problem is not exact enough, here is a more precise one: I have a file with different kinds of datasets in it. Every kind of dataset is defined by the first 4 characters of the line: for example, "CUST" tells us that the line is the definition of the customer which should be updated. This dataset is followed by one or more records with "ADDR", which indicates that the line contains address information for the customer. Then the next customer follows in another "CUST" line. So now let's imagine that the customers exist in a database and that I have to create or update the address entries for these customers. How do I do this with JasperETL? The problem is that there are fixed positions which contain the required information, but they are not fixed for the whole file, only for the specific dataset formats "CUST" and "ADDR". Example:
      CUSTJohn Doe
      ADDR12345London Street 4
      Thanks for your help! Stefan
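One generic approach is to read the file line by line (for example as a one-column delimited flow) and dispatch on the first four characters in Java. The sketch below pairs each ADDR line with the most recent CUST line; the field offsets are invented to fit the example above, and the real ones would come from the interface specification:

```java
import java.util.ArrayList;
import java.util.List;

public class MultiFormatParser {
    /**
     * Parses lines whose record type is encoded in the first 4 characters.
     * CUST lines set the current customer; ADDR lines are split positionally
     * (assumed: 5-character postal code, then the street) and emitted
     * together with that customer.
     */
    public static List<String> parse(List<String> lines) {
        List<String> out = new ArrayList<>();
        String currentCustomer = null;
        for (String line : lines) {
            String type = line.substring(0, 4);
            String payload = line.substring(4);
            if (type.equals("CUST")) {
                currentCustomer = payload.trim();      // e.g. the customer name
            } else if (type.equals("ADDR")) {
                String zip = payload.substring(0, 5);  // assumed 5-digit code
                String street = payload.substring(5).trim();
                out.add(currentCustomer + " -> " + zip + " " + street);
            }
        }
        return out;
    }
}
```

Instead of building strings, a real job would route CUST and ADDR rows to two separate schemas/flows after this dispatch step.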
  14. Hi all! For a new project we have to handle special input files: we get one big file stream from an external system. This stream consists of data of different kinds for various purposes. Each block of effective data (which consists of n datasets) is surrounded by a packet prefix dataset, a fore-dataset and a postfix dataset, as well as a packet end dataset. Between these are the important records which hold the data. Structure:
      ** packet fore-dataset projId:X **
      * fore-dataset (some info) *
      dataset-type1 ("real data")
      dataset-type2 ("real data")
      dataset-type2 ("real data")
      dataset-type1 ("real data")
      dataset-type2 ("real data")
      * fore-dataset (some info) end forProjX *
      ** packet end-dataset projId:X **
      ** packet fore-dataset projId:Y **
      [...]
      The surrounding records are used to identify the following data structure and hold some other information which should be used in the ETL process, so they need to be processed too. Also, there can be different kinds of real-data records. All this data is held in a positional text file; the problem is that the positions differ from dataset type to dataset type, so I guess I am not able to use tFileInputPositional. Any ideas how to handle this problem within JasperETL? Thank you in advance for your answers! Stefan Mackovik
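The same dispatch-by-prefix idea extends to the packet structure: a tiny state machine remembers the projId of the current packet and tags every real-data line with it. The marker conventions below ("**" for packet datasets, "*" for fore-datasets) just mirror the sketch above and would need adjusting to the real file format:

```java
import java.util.ArrayList;
import java.util.List;

public class PacketReader {
    /**
     * Walks the stream line by line, tracking the enclosing packet's projId,
     * and returns each real-data line prefixed with that projId so later
     * steps can route it by project and dataset type.
     */
    public static List<String> dataWithProject(List<String> lines) {
        List<String> out = new ArrayList<>();
        String projId = null;
        for (String line : lines) {
            if (line.startsWith("**")) {
                // Packet marker: remember the project id it announces.
                int p = line.indexOf("projId:");
                if (p >= 0) {
                    projId = line.substring(p + 7).replace("*", "").trim();
                }
            } else if (line.startsWith("*")) {
                // Fore-dataset: metadata; a real job would parse it, too.
            } else {
                out.add(projId + ":" + line); // real data, tagged with projId
            }
        }
        return out;
    }
}
```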
  15. This is also an important requirement for our projects. At the moment we're using self-written Java batch jobs, which generate error logs (for technical errors) and protocol files (so anyone can check whether these records have really been written into the database, and which rows couldn't be written because they failed business validation). In some cases 1..n log files would be required. Thanks in advance for your help.