Web Host Database Bulk Importer
Many small businesses need to synchronize database recordsets in their accounting
or inventory control system with corresponding records on their web host.
A number of difficulties arise, depending on the size of the data, the frequency
of updates, and the skills of the website user.
- Importing hundreds of thousands of records requires breaking up the import
file into dozens or even hundreds of files.
- Records that need to be updated frequently (say, once a week) in response
  to changing circumstances involve a significant amount of labor or owner
  distraction.
- In many cases the person responsible for database synchronization is an
  'operator', with limited understanding of the quirks of record imports into
  particular databases.
The Web Host Database Bulk Importer system is a customized package for handling
large volumes of data as often as necessary without involving a skilled user.
Customization is required for a number of reasons:
- The source data store may be any of the following: a flat file, an Excel
spreadsheet, a Quickbooks table, an SQL rowset on the local area network, or
another MySQL database on a remote host.
- Different hosts have different capacities or limitations, and in some cases
policies with varying restrictions.
- Users may want the program to run according to a user interface familiar to
them, or restrict access to particular areas of the business.
- Data may be 'unfiltered', in other words raw user input, with strange formatting,
  embedded HTML, 'required' fields that are empty, etc. Users may have varying
  criteria for how they want unusual or malformed data to be handled.
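The cleanup step for 'unfiltered' data can be sketched as follows. This is an illustrative Python sketch only; the cleaning rules (stripping embedded HTML, collapsing whitespace, substituting defaults for empty 'required' fields) and the `clean_record` function itself are assumptions, not part of any particular customization.

```python
import html
import re

def clean_record(record, defaults):
    """Normalize one raw record before import (hypothetical rules)."""
    cleaned = {}
    for field, value in record.items():
        if isinstance(value, str):
            value = html.unescape(value)            # decode HTML entities
            value = re.sub(r"<[^>]+>", "", value)   # strip embedded HTML tags
            value = " ".join(value.split())         # collapse odd whitespace
        if value in ("", None):                     # empty 'required' field
            value = defaults.get(field)             # fall back to a default
        cleaned[field] = value
    return cleaned

record = {"name": "  ACME&nbsp;<b>Widgets</b> ", "price": ""}
print(clean_record(record, {"price": "0.00"}))
# → {'name': 'ACME Widgets', 'price': '0.00'}
```

In practice each customization would carry its own rule set, since different users want empty or malformed fields handled differently.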
In its general form, this system has a 'desktop' component and a 'webhost' component.
The 'desktop' component reads data from the appropriate source file(s) and generates (if
necessary) appropriate SQL commands to insert or update the records. In some
cases the file may already be a collection of SQL commands. More often, the
input is simply extracted from another database or application.
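The SQL-generation step on the desktop side might look like the following sketch. The `make_insert` helper and the example table and column names are hypothetical; a real customization would match the host database's schema and use proper escaping for its dialect.

```python
def make_insert(table, row):
    """Build one INSERT statement from a dict of column -> value.
    Illustrative only: table and column names are assumed trusted,
    and values are escaped in a simple MySQL-style fashion."""
    cols = ", ".join(row.keys())
    vals = ", ".join(
        "'" + str(v).replace("\\", "\\\\").replace("'", "\\'") + "'"
        for v in row.values()
    )
    return f"INSERT INTO {table} ({cols}) VALUES ({vals});"

rows = [{"sku": "A-100", "qty": 12}, {"sku": "B-200", "qty": 3}]
for stmt in (make_insert("inventory", r) for r in rows):
    print(stmt)
# → INSERT INTO inventory (sku, qty) VALUES ('A-100', '12');
# → INSERT INTO inventory (sku, qty) VALUES ('B-200', '3');
```

When the source file is already a collection of SQL commands, this step is skipped and the file is passed through as-is.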
The 'webhost' component consists of two PHP pages: the first renders an 'input' form
which is filled in by the desktop component and 'submitted' to the second page,
which extracts the SQL expressions and executes them against the database.
The general idea is to create blocks of 1000 records per submission, fill the first PHP
page with these commands, then submit the 1000 commands to the second page. A single
submission cycle usually completes in less than a minute, allowing even very large sets
of records to import within an hour, often unattended. Such processing runs can be
scheduled to take place during 'off-hours'.
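The batching cycle described above can be sketched like this. The block size of 1000 comes from the text; the endpoint URL, the `sql` form field name, and the `submit_block` helper are assumptions for illustration only.

```python
import urllib.parse
import urllib.request

def chunks(items, size=1000):
    """Yield successive blocks of `size` items (1000 records per submission)."""
    for i in range(0, len(items), size):
        yield items[i:i + size]

def submit_block(url, statements):
    """POST one block of SQL statements to the executor page
    (hypothetical endpoint and field name)."""
    data = urllib.parse.urlencode({"sql": "\n".join(statements)}).encode()
    with urllib.request.urlopen(urllib.request.Request(url, data=data)) as resp:
        return resp.status

statements = [f"INSERT INTO t (id) VALUES ({n});" for n in range(2500)]
blocks = list(chunks(statements))   # 3 blocks: 1000 + 1000 + 500
# for block in blocks:
#     submit_block("https://example.com/import/execute.php", block)
```

Because each submission cycle completes in under a minute, the loop can simply run block after block until the set is exhausted, with no operator attention required.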
More complicated cases might involve collecting data from multiple data sources, exporting
to multiple hosts, or otherwise processing the information during the upload.
Demonstrations can be arranged on one or two days' notice.