Legacy Data

james-2
The author has deleted this message.

Re: Legacy Data

Jacques Le Roux
Hi James,

I've never used that myself, but we have:

https://cwiki.apache.org/confluence/display/OFBIZ/Import+Data+Using+Apache+POI+api

https://cwiki.apache.org/confluence/display/OFBIZ/Handling+of+External+data

Look for Excel at https://cwiki.apache.org/confluence/display/OFBIZ/FAQ+-+Tips+-+Tricks+-+Cookbook+-+HowTo

Jacques

On 01/05/2019 at 18:22, [hidden email] wrote:
> We are trying to upload legacy customers into ofbiz using the data
> import tool.  Is there a spreadsheet template to utilize?  I uploaded my
> file but the "not processed" field still shows 0.  Can someone send me
> the excel template for uploading?
>
> Thanks,
>
> James
>

Re: Legacy Data

Pierre Smits-3
In reply to this post by james-2
Hi James,

How much legacy data are you talking about importing? It might be more
prudent to use an ETL solution to do that.

Best regards,

Pierre Smits

Apache Trafodion <https://trafodion.apache.org>, Vice President
Apache Directory <https://directory.apache.org>, PMC Member
Apache Incubator <https://incubator.apache.org>, committer
Apache OFBiz <https://ofbiz.apache.org>, contributor (without privileges) since 2008
Apache Steve <https://steve.apache.org>, committer



Re: Legacy Data

Michael Brohl-3
In reply to this post by james-2
You can also easily generate load data with Excel itself:

- create a new sheet and insert a formula that concatenates the static parts
with the dynamic parts/cells of the data sheet (see the example below),

- apply the formula to each row by dragging the lower right corner of
the cell containing the formula down over the rows containing data,

- copy the first column and paste it into the load data XML file.

Example:

=CONCATENATE("<Party partyId='";sheet1!A65;"' partyTypeId='";sheet1!B65;"' externalId='";sheet1!C65;"' statusId='";sheet1!D65;"' />")

(Note: the semicolon argument separator depends on the Excel locale; a US-locale Excel expects commas.)
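
For reference, the generated column pasted into a load-data file would look roughly
like the sketch below; the party values are made-up placeholders, and the
<entity-engine-xml> root element is the wrapper the OFBiz XML data loader expects:

<?xml version="1.0" encoding="UTF-8"?>
<entity-engine-xml>
    <!-- hypothetical records for illustration only -->
    <Party partyId='LEGACY_10001' partyTypeId='PERSON' externalId='CUST-10001' statusId='PARTY_ENABLED'/>
    <Party partyId='LEGACY_10002' partyTypeId='PERSON' externalId='CUST-10002' statusId='PARTY_ENABLED'/>
</entity-engine-xml>

Such a file can then be imported through the XML data import screen in Web Tools
or, if I remember correctly, with the entity engine's regular data-load mechanism.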

Regards,

Michael Brohl

ecomify GmbH - www.ecomify.de



Re: Legacy Data

marcopaul
Using Excel's XML features:
- In Web Tools:
        - Export the entity you want to work with, one at a time.
        - Open the exported file in Excel on Windows.
                - Last I checked, Mac Excel's XML support was poor (and Mac Excel is weak in general).
        - Excel will figure out the schema and embed it, so you can save the data back out as XML from there.

I'm going from memory here, as I haven't done it in a while, but at the time I found it the easiest way to directly load/update data in the entity engine, short of writing the data directly.
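
As a concrete illustration (the entity and values below are examples added here, not from the original post): a Web Tools entity export uses the same entity-engine-xml format as load data, with one element per record, and that structure is what Excel derives its schema from. For the Person entity it would look something like:

<?xml version="1.0" encoding="UTF-8"?>
<entity-engine-xml>
    <!-- hypothetical export rows; a real export carries whatever fields the entity defines -->
    <Person partyId='LEGACY_10001' firstName='Jane' lastName='Doe'/>
    <Person partyId='LEGACY_10002' firstName='John' lastName='Smith'/>
</entity-engine-xml>

Editing the rows in Excel and saving back to XML should keep this structure intact so the file can be re-imported.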

It's possible, though I'm not sure it's advisable, to load data directly into the underlying database via an ETL tool like Pentaho BI. Does anyone know the risks of doing this? I haven't had time to research the implications.

—P



Re: Legacy Data

Pierre Smits-3
In the recent past, and for several customer projects, I have been involved
in moving quite a lot of data from the databases of (legacy) solutions into
other systems (DWH, etc.) with ETL solutions. One of those projects moved
data from a legacy system into the OFBiz databases (both the transactional
one and the DWH).

Before I delve deeper into the benefits and costs of using such a solution, I
want to remark that, no matter what tool is used (Excel, CSV, etc.) to move
legacy data into the transactional and DWH databases of OFBiz, such
activities require a lot of planning [1], development [2] and testing [3]
before data can be moved into the production environment.

[1] The entity model of OFBiz is far more complex than some may anticipate
(based on experience with their legacy solution), as there are many
interlocking dependencies between tables. Careful planning of what needs to
be loaded in what sequence requires in-depth knowledge of the entity model.
[2] Legacy data may - in my experience - adhere to different paradigms
regarding field type definitions. That requires quite some transformation of
field values before they can be loaded into the OFBiz database(s).
[3] Before loading into the database(s) in the production environment, the
load part of the ETL processes must be tested in a separate environment, so
that at the moment of loading into production the disruption of operational
activities (of users and/or OFBiz services) is minimised; such disruption
(and a potential rebuild of the production environment) is quite costly.

Using an ETL solution to migrate data delivers several benefits:

   1. Extraction can be done against various sources, whether a (live)
   database, Excel sheets, or data files in CSV, JSON and/or XML format.
   2. Transforming data values, as well as combining data from different
   sources, can be developed in various ways and tested step by step.
   3. Changing where the ETL solution loads the data into is fast, and the
   data can go into various targets: a database, Excel sheets, or data files
   in CSV, JSON and/or XML format.
   4. The ETL processes can be executed both on demand and during
   off-business hours (scheduled), to cater both for a) needing the data now
   and b) minimising operational disruptions.

The key thing to keep in mind is that migrating legacy data into the OFBiz
databases does NOT mean that it is automatically processed by the OFBiz
engine. OFBiz has a lot of services that are triggered by other services
(consider the GL transaction creation on a state change of an invoice or
payment). Loading legacy data into the transactional database of OFBiz
implies that such state changes still need to happen on these kinds of
parent objects to get to the desired end result.
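
To make that concrete (a rough sketch; the identifiers below are made-up examples,
not from the original post): loading an invoice record directly, as below, only
writes the row. The accounting services that would normally fire on an invoice
status change (GL posting, etc.) never run for it, so any AcctgTrans entries have
to be produced separately, either by replaying the status changes through the
services or by loading them as data too.

<entity-engine-xml>
    <!-- hypothetical legacy invoice loaded straight into the Invoice entity;
         no GL postings are created because no status-change service was invoked -->
    <Invoice invoiceId='LEGACY-INV-001' invoiceTypeId='SALES_INVOICE'
             partyIdFrom='Company' partyId='LEGACY_10001'
             statusId='INVOICE_PAID' invoiceDate='2019-04-01 00:00:00'
             currencyUomId='USD'/>
</entity-engine-xml>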

Another thing to keep in mind is that OFBiz is - for the most part - intended
as a transactional solution. With legacy data one should distinguish between
what is needed for operations and what is intended for historical insight
through business intelligence. Often, if not always, a lot of the legacy data
is only intended for the latter, and one should consider aggregating (some
of) that data into the OFBiz OLAP database.

If anyone wants more insights and/or assistance, feel free to contact me
directly.

Best regards,

Pierre Smits




Re: Legacy Data

Jacques Le Roux
Hi Pierre,

Maybe you could enrich the related wiki pages, or create a new one, using your message above?

Thanks

Jacques


Re: Legacy Data

Pierre Smits-3
Thank you, Jacques, for the suggestion. I will take that into consideration.

Best regards,

Pierre Smits




Re: Legacy Data

marcopaul
In reply to this post by Pierre Smits-3
Thanks for the insight!
—P


RE: Legacy Data

james-2
In reply to this post by james-2
The author has deleted this message.

Re: Legacy Data

marcopaul
It's hard to wade through all the MBA PowerPoint marketing, but Pentaho Data Integration (Kettle) is the piece you're interested in. It was completely open source at some point; I think the core piece still is, despite all the rebranding and the push towards buying the whole platform.

It has a drag 'n' drop, block-based, data-flow-style GUI that is very intuitive once you get past the learning curve. You can implement all common ETL tasks without coding, and everything is extensible otherwise.

This YouTube video shows the basic flow:
https://www.youtube.com/watch?v=28f_qUo4U4c

I've gotten it working well at the database level. I haven't played around with getting it to talk to the OFBiz service engine, which would be killer. Maybe someone else here has experience there.

—P


> On May 3, 2019, at 10:28 AM, [hidden email] wrote:
>
> We have used a tool like Talend in the past but are there any open
> source ETL with a graphical UI?