Import/Export large amount of data

Moderator: crythias

Locked
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Import/Export large amount of data

Post by dtamajon »

Hi again!

I have created a bash script to import data into the CMDB periodically through a cron job. The job works fine as long as the CSV to import is quite small (fewer than 200 rows).

I have configured the template to accept 1000 rows, but the process throws errors after processing about 200 rows. The message I get for every row that is not imported is:
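For reference, the cron entry looks roughly like this (template number, paths, and schedule are placeholders, not my real ones; check the usage output of bin/otrs.ImportExport.pl if the switches differ in your version):

```shell
# Crontab entry (placeholders): run the CMDB import nightly at 02:15.
# -n is the Import/Export template number, -i the CSV file to import.
15 2 * * * /opt/otrs/bin/otrs.ImportExport.pl -n 00001 -i /opt/otrs/var/import/cmdb.csv >> /var/log/otrs-import.log 2>&1
```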

Message: Can't write '/opt/otrs/var/tmp/CacheFileStorable/XML/63183a03ff12b1d42191240255c4dd9d': Permission denied

I have cleaned out that directory, and then it is able to import some more rows again, but as soon as it reaches 200 new rows it breaks again.

I don't know why: the first time I found about 200 files; then I cleaned up and ran it again, and now I find 400 files, and my CMDB contains the 200 previous items plus 200 new items from the second execution... it makes no sense to me :(

Is there any limit on processing import data files when most of the items are new to OTRS? Do you have any idea about this problem?

Thanks again!!
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

For more info, the otrs account is in the www-data group.
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

I resolved it... some folders didn't have the correct permissions. I ran SetPermissions again and now everything is correct... maybe I missed or mistyped some parameter before.
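For anyone who hits the same cache permission errors, re-applying permissions on OTRS 3.1 looks roughly like this (the user and group names match my Ubuntu/Apache setup and may differ on yours):

```shell
# Re-apply the ownership/permission scheme OTRS expects under /opt/otrs.
# The otrs user and the www-data web user/group are from my setup; adjust to yours.
/opt/otrs/bin/otrs.SetPermissions.pl --otrs-user=otrs --web-user=www-data \
    --otrs-group=www-data --web-group=www-data /opt/otrs
```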

Now my problem is the input file, which is an export from another PostgreSQL database:

Message: ImportError at line 1615, ErrorCode: 2012 'EOF - End of data in parsing input stream'

I'm analysing the data... something is incorrect in the file.
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

I can't solve the problem. I have reviewed the file to import and everything is correct. I tested exporting from OTRS and importing that same file, and it returns the same error:

Code:

Import in process...
ERROR: OTRS-ImportExport-10 Perl: 5.10.1 OS: linux Time: Sun Sep 16 19:27:47 2012

 Message: ImportError at line 1615, ErrorCode: 2012 'EOF - End of data in parsing input stream'

 Traceback (17134):
   Module: Kernel::System::ImportExport::FormatBackend::CSV::ImportDataGet (v1.29) Line: 325
   Module: Kernel::System::ImportExport::Import (v1.41) Line: 2188
   Module: /opt/otrs/bin/otrs.ImportExport.pl (v1.2) Line: 128
My Import/Export configuration is:
  • Object: ITSMConfigItem
  • Format: CSV
  • Class: Hardware
  • Maximum number of one element: 1
  • Empty fields indicate that the current values are kept: true
  • Column Separator: ;
  • Charset: UTF-8
  • Include Column Headers: No
  • Columns: [MyId, identifier], [Name], [Vendor], [Model], [SerialNumber], [SiteLocation], [Topology], [MacAddress], [Deployment State], [Incident State]
The field "MyId" is a custom field configured in the Config Items Hardware class as:

Code:

        {
            Key => 'MyId',
            Name => 'MyId',
            Searchable => 1,
            Input => {
                Type => 'Text',
                Size => 10,
                MaxLength => 10,
            },
        },
Do you have any idea why this happens? I have been searching for similar issues, but found no solutions...

Thanks!!
Last edited by dtamajon on 19 Sep 2012, 12:42, edited 4 times in total.
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
crythias
Moderator
Posts: 10170
Joined: 04 May 2010, 18:38
Znuny Version: 5.0.x
Location: SouthWest Florida, USA
Contact:

Re: Import/Export large amount of data

Post by crythias »

ImportError at line 1615, ErrorCode: 2012 'EOF - End of data in parsing input stream'
Seems like a good place to look at your data.
OTRS 6.0.x (private/testing/public) on Linux with MySQL database.
Please edit your signature to include your OTRS version, Operating System, and database type.
Click Subscribe Topic below to get notifications. Consider amending your topic title to include [SOLVED] if it is so.
Need help? Before you ask
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

I just did; I looked at the hidden characters and saw nothing strange. That line is the last line in the file.

I tried to add an extra LF, but that doesn't work either.

When using the web GUI, I can see in the log that the self-exported file is imported correctly... but when I do the same operation from the command line, it returns that message.
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

I have created a file with just one row, and the problem is the same.

"10249";"ZUBH98228108";;;;"My Loc 2";;"0B103T54XD32";"Production";"Operational"

But now I get a message with the items processed:

Code:

Import in process...
ERROR: OTRS-ImportExport-10 Perl: 5.10.1 OS: linux Time: Mon Sep 17 09:08:47 2012

 Message: ImportError at line 2, ErrorCode: 2012 'EOF - End of data in parsing input stream'

 Traceback (21383):
   Module: Kernel::System::ImportExport::FormatBackend::CSV::ImportDataGet (v1.29) Line: 325
   Module: Kernel::System::ImportExport::Import (v1.41) Line: 2188
   Module: /opt/otrs/bin/otrs.ImportExport.pl (v1.2) Line: 128


Import of 1 ITSMConfigItem records: 0 failed, 1 succeeded
Import of 1 ITSMConfigItem records: 1 Skipped
The skipped record is correct, because that row already exists in my CMDB... but I still don't know why the error message appears.
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
crythias
Moderator
Posts: 10170
Joined: 04 May 2010, 18:38
Znuny Version: 5.0.x
Location: SouthWest Florida, USA
Contact:

Re: Import/Export large amount of data

Post by crythias »

What version of OTRS?
OTRS 6.0.x (private/testing/public) on Linux with MySQL database.
Please edit your signature to include your OTRS version, Operating System, and database type.
Click Subscribe Topic below to get notifications. Consider amending your topic title to include [SOLVED] if it is so.
Need help? Before you ask
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

Versions installed are:

otrs-3.1.10
ITSM-3.1.6


For more info:

Linux Ubuntu 10.04 LTS
PostgreSQL 9.1
Perl 5.10.1
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

I have checked that, if I don't kill the process, all data is imported into the database; but the console still shows the error, and the process looks unfinished, never returning control to the system... I need to end it manually, so I'm afraid the cron threads will linger after each execution finishes.
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

After some investigation and learning some Perl, I have replaced the following code in the CSV.pm file (module Kernel::System::ImportExport::FormatBackend::CSV):

Code:

    while ( my $Column = $ParseObject->getline($FH) ) {
        push @ImportData, $Column;
        $LineCount++;
    }
with this code, where eof is tested explicitly:

Code:

    until ( eof($FH) ) {
        my $Column = $ParseObject->getline($FH);
        push @ImportData, $Column;
        $LineCount++;
    }

I think the problem is the use of "getline", as explained in the Text::CSV_XS documentation:
2012 "EOF - End of data in parsing input stream"
Self-explaining. End-of-file while inside parsing a stream. Can only happen when reading from streams with "getline", as using "parse" is done on strings that are not required to have a trailing eol.
I have checked my different environments (Linux / Windows) and it's working fine (or at least, there are no eof error messages).
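The pattern can be sketched with a plain filehandle (core Perl only, no Text::CSV_XS, so this only illustrates the eof() guard, not the CSV parsing itself):

```perl
use strict;
use warnings;

# A two-line "stream" whose last record has no trailing newline, like a
# CSV export that ends without an EOL.
my $data = "a;b;c\nd;e;f";    # note: no final "\n"
open my $FH, '<', \$data or die "open: $!";

my @rows;
my $LineCount = 0;

# Guard the loop with eof() instead of relying on the reader returning
# a false value, mirroring the change made to CSV.pm above.
until ( eof($FH) ) {
    my $Line = <$FH>;
    last if !defined $Line;    # defensive: stop on an unexpected read failure
    chomp $Line;
    push @rows, [ split /;/, $Line ];
    $LineCount++;
}
close $FH;

print "read $LineCount lines\n";    # the unterminated last line is read too
```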

Could anyone confirm I'm right?
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
crythias
Moderator
Posts: 10170
Joined: 04 May 2010, 18:38
Znuny Version: 5.0.x
Location: SouthWest Florida, USA
Contact:

Re: Import/Export large amount of data

Post by crythias »

The EOF error should generally be cosmetic, as it should only occur at the end of the file.
dtamajon wrote:[MyId, identifier], [Name], [Vendor], [Model], [SerialNumber], [SiteLocation], [Topology], [MacAddress], [Deployment State], [Incident State]
Was [MyId, identifier] a miscopy, or is it exactly as shown?

"10249";"ZUBH98228108";;;;"My Loc 2";;"0B103T54XD32";"Production";"Operational"
Read against your column list, that maps to: MyId="10249"; identifier="ZUBH98228108"; Name=""; Vendor=""; Model=""; SerialNumber="My Loc 2"; SiteLocation=""; Topology="0B103T54XD32"; MacAddress="Production"; Deployment State="Operational"; Incident State=(nothing).

You're short a column.
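A quick one-off makes the counting explicit (a plain split is good enough here only because none of the quoted values embeds a semicolon; a real check should use a CSV parser):

```perl
use strict;
use warnings;

my $line = '"10249";"ZUBH98228108";;;;"My Loc 2";;"0B103T54XD32";"Production";"Operational"';

# A limit of -1 keeps trailing empty fields; splitting on ';' is safe here
# only because no quoted value in this row contains an embedded semicolon.
my @fields = split /;/, $line, -1;
printf "%d fields\n", scalar @fields;    # prints "10 fields"
```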
OTRS 6.0.x (private/testing/public) on Linux with MySQL database.
Please edit your signature to include your OTRS version, Operating System, and database type.
Click Subscribe Topic below to get notifications. Consider amending your topic title to include [SOLVED] if it is so.
Need help? Before you ask
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

With [MyId, identifier] I mean that the field [MyId] is marked as an identifier.

I suppose you counted "identifier" as a column, which it is not.

Anyway, it should work if I "export" from the GUI and then "import" that very same file from the command line, but it does not always run OK.

I have seen that the error appears only in certain environments, but I don't know the cause... the data I'm using is exactly the same in all the environments I have tested.

If I have some time I will investigate further to look for differences between my environments.
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
crythias
Moderator
Posts: 10170
Joined: 04 May 2010, 18:38
Znuny Version: 5.0.x
Location: SouthWest Florida, USA
Contact:

Re: Import/Export large amount of data

Post by crythias »

dtamajon wrote:the error appears only on certain environments
Like what? Any chance you're including semicolons that aren't wrapped in apostrophes/quotes, or something similar?
In my experience, import failures of this type are *usually* a problem with column counts.

Can you post the last line of your file?
OTRS 6.0.x (private/testing/public) on Linux with MySQL database.
Please edit your signature to include your OTRS version, Operating System, and database type.
Click Subscribe Topic below to get notifications. Consider amending your topic title to include [SOLVED] if it is so.
Need help? Before you ask
jeske
Znuny newbie
Posts: 10
Joined: 10 Jan 2012, 17:44
Znuny Version: 3.1.2
Real Name: jeff
Company: Creighton University
Location: Creighton University, Omaha, NE
Contact:

Re: Import/Export large amount of data

Post by jeske »

I've just run into the same problem. The way I solved the "EOF - End of data in parsing input stream" error was to remove the Text::CSV_XS Perl module. From what I can tell, it may be due to the way it reads data in from the input stream or something. I'm not sure exactly, but when I deleted that module, the error went away.

Good luck.

Jeff
OTRS 3.1 on Linux with MySQL database connected to an Active Directory for Agents and Customers.
dtamajon
Znuny newbie
Posts: 15
Joined: 21 Aug 2012, 19:47
Znuny Version: 3.6.0
Real Name: Daniel
Company: Lite Solutions

Re: Import/Export large amount of data

Post by dtamajon »

My one-row file, which fails too, is:

"10249";"ZUBH98228108";;;;"My Loc 2";;"0B103T54XD32";"Production";"Operational"

I'm not getting any errors with the Perl modification I made, and all my data is up to date. I have checked the daily logs and there are no errors, so the eof test solved my problem.
Linux Ubuntu 10.04 LTS / PostgreSQL 9.1 / Perl 5.10.1
OTRS 3.1.10 / ITSM 3.1.6
Wolfgangf
Znuny ninja
Posts: 1029
Joined: 13 Apr 2009, 12:26
Znuny Version: 6.0.13
Real Name: Wolfgang Fürtbauer
Company: PBS Logitek GmbH
Location: Pinsdorf

Re: Import/Export large amount of data

Post by Wolfgangf »

Hip-hip hooray! I was looking for a long time for a solution to this error.
Has somebody opened a bug for this at bugs.otrs.org?
Produktiv:
OTRS 6.0.13/ ITSM 6.0.13
OS: SUSE Linux (SLES 12, Leap), MySql 5.5.x, 5.6.x
Windows 2012 AD Integration (agents and customers), Nagios integration (incidents, CMDB), Survey, TimeAccounting
Locked