Executing Data Pump with the /w parameter turns it into an automated data uploader. It supports SHP, MID/MIF and TAB datasets. Simply drop a set of files (in uncompressed form) into the folder watched by Data Pump, and within a matter of minutes the datasets will be uploaded into the database.
- Watcher supports three types of datasets. Each type comprises a set of files; in every case all of the listed files are required for a dataset to be uploaded.
- ESRI Shapefile – .shp, .dbf, .prj, .shx
- MapInfo TAB – .tab, .dat, .map, .id
- MapInfo MID – .mid, .mif
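The completeness rule above can be sketched in a few lines. The helper below is purely illustrative (it is not part of Data Pump): given a folder listing, it reports which datasets have their full set of files and would therefore be picked up.

```python
import os

# Required extensions per dataset type (from the list above).
REQUIRED = {
    "ESRI Shapefile": {".shp", ".dbf", ".prj", ".shx"},
    "MapInfo TAB": {".tab", ".dat", ".map", ".id"},
    "MapInfo MID": {".mid", ".mif"},
}

def complete_datasets(filenames):
    """Return {basename: type} for every dataset whose file set is complete."""
    by_base = {}
    for name in filenames:
        base, ext = os.path.splitext(name)
        by_base.setdefault(base.lower(), set()).add(ext.lower())
    found = {}
    for base, exts in by_base.items():
        for kind, required in REQUIRED.items():
            if required <= exts:  # all required files for this type are present
                found[base] = kind
    return found
```

For example, a folder containing roads.shp, roads.dbf, roads.prj and roads.shx plus a lone parks.mid yields only the complete "roads" Shapefile; "parks" is ignored until its .mif arrives.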
To set up Data Pump as a watchdog, run it with the following parameters:
DataPump.exe /w pathToWatcherCard
where pathToWatcherCard is the location of the Watcher configuration file.
If no configuration file is found, a template will be created in its place. The template will be named watcher.xml.template.
In watcher.xml you have to specify 7 parameters:
- WatchedFolder – path to the watched folder
- DbConnectionString – database connection string from Earthlight’s web.config (usually from the Main Repository)
- TableMappingFilePath – location of the table name mapping file: mapping.txt
- ReportsLocation – folder where temporary import scripts and reports will be written
- LeaveDatasetsAfterImport – true/false; controls whether Watcher leaves the dataset files in place after import
- IncludeSubfolders – true/false; controls whether Watcher also processes subfolders of the watched folder
- PreserveFolderStructure – true/false; controls whether the folder name is added to the name of the dataset to create a unique table name in the database. Requires IncludeSubfolders to be true
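The effect of PreserveFolderStructure might be pictured as below. This is only an illustration of the idea of combining the subfolder name with the dataset name; the exact separator and naming scheme Data Pump uses are not documented here.

```python
def table_name(dataset, subfolder=None, preserve=False):
    """Illustrative only: with PreserveFolderStructure enabled, the subfolder
    name is combined with the dataset name to keep table names unique.
    The underscore separator is an assumption, not Data Pump's documented rule."""
    if preserve and subfolder:
        return f"{subfolder}_{dataset}"
    return dataset
```

So two datasets both named "roads", one in a "highways" subfolder, would no longer collide once the folder name is part of the table name.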
In mapping.txt you may explicitly specify the database table name to be used for a specific dataset. To do so, write the name of the database table and the name of the dataset (without extension), separated by a pipe (|) symbol. Please see the example below for reference.
TABLE_002|All recycling sites
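The mapping format can be parsed with a short sketch like the one below (a hypothetical helper, not Data Pump's actual parser): each non-empty line is split on the first pipe into a table name and a dataset name.

```python
def parse_mapping(text):
    """Parse mapping.txt lines of the form TABLE_NAME|dataset name.

    Returns a dict keyed by dataset name, so a dataset can be looked up
    when Watcher decides which table to load it into.
    """
    mapping = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or "|" not in line:
            continue  # skip blank or malformed lines
        table, dataset = line.split("|", 1)
        mapping[dataset.strip()] = table.strip()
    return mapping
```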
An example watcher.xml file is provided below; the root element name and the folder paths are placeholders, so adjust them and the connection string to match your environment. With these settings Watcher will upload datasets from the root and all subfolders of the watched folder and will leave the datasets in place after the import.
<?xml version="1.0" encoding="utf-8"?>
<Watcher>
  <WatchedFolder>C:\DataPump\Incoming</WatchedFolder>
  <DbConnectionString>database=statmap;user id=USER;password=PASSWORD;timeout=15;pooling=True;enlist=False;integrated security=False;initial catalog=EarthlightDB;data source=dbserver\sqlexpress;cartridge=SqlServer;schema=dbo</DbConnectionString>
  <TableMappingFilePath>C:\DataPump\mapping.txt</TableMappingFilePath>
  <ReportsLocation>C:\DataPump\Reports</ReportsLocation>
  <LeaveDatasetsAfterImport>true</LeaveDatasetsAfterImport>
  <IncludeSubfolders>true</IncludeSubfolders>
  <PreserveFolderStructure>false</PreserveFolderStructure>
</Watcher>
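The seven parameters can be read back from such a file with a short sketch (a hypothetical helper, not part of Data Pump; it looks the elements up under the root, whatever the root element is named):

```python
import xml.etree.ElementTree as ET

# The seven Watcher parameters listed above.
PARAMS = [
    "WatchedFolder", "DbConnectionString", "TableMappingFilePath",
    "ReportsLocation", "LeaveDatasetsAfterImport",
    "IncludeSubfolders", "PreserveFolderStructure",
]

def read_watcher_config(xml_text):
    """Return the seven Watcher parameters as a dict (missing ones map to None)."""
    root = ET.fromstring(xml_text)
    return {name: root.findtext(name) for name in PARAMS}
```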
Upload scripts and reports are deleted after each successful upload.
If an error occurs during the import procedure, an error file is created in the folder where the dataset is stored. The error file is a text file whose name is formed by adding the !ERROR_ prefix to the name of the dataset. The report contains all the information that Data Pump was able to collect about the problem.
As you can see in the screenshot above, two datasets have failed to import. If you want to attempt to import these particular datasets again, you must delete the !ERROR_… files. As long as these files remain in the folder, the datasets will be ignored by all future imports.
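The skip rule can be expressed as a small check (illustrative only; the exact extension of the error file is not specified above, so the sketch matches any file beginning with the prefix):

```python
import os

def is_blocked(folder, dataset_name):
    """True if a previous failure report exists for this dataset, in which
    case Watcher will ignore the dataset in future imports.

    Assumes the report is named by prefixing the dataset name with !ERROR_;
    deleting that file makes the dataset eligible for import again.
    """
    prefix = "!ERROR_" + dataset_name
    return any(f.startswith(prefix) for f in os.listdir(folder))
```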