
Commit 17695fb

Bug fixes. Refactored scripts & README content.
1 parent 8b7658d commit 17695fb

21 files changed

Lines changed: 157 additions & 117 deletions

samples/features/sql-big-data-cluster/README.md

Lines changed: 27 additions & 14 deletions
@@ -11,33 +11,46 @@ Installation instructions for SQL Server 2019 big data cluster can be found [her
 
 ## Samples Setup
 
-**Before you begin**, download the sample database [backup file](https://sqlchoice.blob.core.windows.net/sqlchoice/static/tpcxbb_1gb.bak) and save it locally. Run the CMD script called *bootstrap-sample-db.cmd* or the shell script *bootstrap-sample-db.sh* depending on your platform. This script will restore the database on the SQL Master instance, execute the *bootstrap-sample-db.sql* script, create the database objects needed, export the web_clickstreams & inventory tables to CSV files, and upload the web_clickstreams CSV file to HDFS inside the SQL Server 2019 big data cluster.
+**Before you begin**, run the CMD script called [bootstrap-sample-db.cmd](bootstrap-sample-db.cmd) or the shell script [bootstrap-sample-db.sh](bootstrap-sample-db.sh), depending on your platform. The script performs the following operations:
+
+1. Downloads the TPCx-BB 1 GB sample database
+1. Restores the database on the SQL Master instance
+1. Executes the bootstrap-sample-db.sql script
+1. Exports the web_clickstreams, inventory, customer & product_reviews tables to files
+1. Uploads the web_clickstreams and product_reviews CSV files to HDFS inside the SQL Server 2019 big data cluster
 
 __[data-pool](data-pool/)__
 
+The SQL Server 2019 big data cluster contains a data pool, which consists of many SQL Server instances that store and query data in a scale-out manner.
+
 ### Data ingestion using Spark
-Connect to the master instance in your SQL Server big data cluster and the SQL Server big data cluster endpoint, and follow the steps in *data-pool/data-ingestion-spark.sql*.
+The sample script [data-pool/data-ingestion-spark.sql](data-pool/data-ingestion-spark.sql) shows how to ingest data from Spark into data pool tables.
 
 ### Data ingestion using SQL
-Connect to the master instance in your SQL Server big data cluster and execute the steps in *data-pool/data-ingestion-sql.sql*.
+The sample script [data-pool/data-ingestion-sql.sql](data-pool/data-ingestion-sql.sql) shows how to ingest data from T-SQL into data pool tables.
 
 __[data-virtualization](data-virtualization/)__
 
-### External table over HDFS
-Connect to the master instance in your SQL Server big data cluster and execute the steps in *data-virtualization/external-table-hdfs.sql*.
+SQL Server 2019 and SQL Server 2019 big data cluster can use PolyBase external tables to connect to other data sources.
+
+### External table over Storage Pool
+SQL Server 2019 big data cluster contains a storage pool consisting of HDFS, Spark, and SQL Server instances. The [data-virtualization/storage-pool](data-virtualization/storage-pool) folder contains samples that demonstrate how to query data in HDFS from inside the SQL Server 2019 big data cluster.
 
 ### External table over Oracle
-To execute this sample script, you will need the following:
-1. An Oracle instance and credentials
-1. Create the inventory table in Oracle using the [data-virtualization/inventory-oracle.sql](data-virtualization/inventory-oracle.sql) script
-1. Import the inventory.csv file generated by the bootstrap-sample-db script into a table in Oracle
+SQL Server 2019 uses new ODBC connectors to enable connectivity to SQL Server, Oracle, Teradata, MongoDB, and generic ODBC data sources.
 
-Connect to the master instance in your SQL Server big data cluster and execute the steps in *data-virtualization/external-table-oracle.sql*.
+The [data-virtualization/oracle](data-virtualization/oracle) folder contains samples that demonstrate how to query data in Oracle using external tables.
+
+__[deployment](deployment/)__
+
+The [deployment](deployment) folder contains the scripts for deploying a Kubernetes cluster for SQL Server 2019 big data cluster.
 
 __[machine-learning](machine-learning/)__
 
-### SQL Server ML Services on master instance
-Connect to the master instance in your SQL Server big data cluster and execute the steps in *machine-learning/sql/book-category-r-ml.sql*.
+SQL Server 2016 added support for executing R scripts from T-SQL. SQL Server 2017 added support for executing Python scripts from T-SQL. SQL Server 2019 adds support for executing Java code from T-SQL. SQL Server 2019 big data cluster adds support for executing Spark code inside the big data cluster.
+
+### SQL Server Machine Learning Services
+The [machine-learning/sql](machine-learning/sql) folder contains sample SQL scripts that show how to invoke R, Python, and Java code from T-SQL.
 
-### Spark ML
-Connect to the SQL Server big data cluster endpoint, and run the notebook files *machine-learning/spark/1-data-prep.ipynb* and *machine-learning/spark/2-build-ml-model.ipynb* cell by cell.
+### Spark Machine Learning
+The [machine-learning/spark](machine-learning/spark) folder contains the Spark samples.
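The data pool samples referenced above are not part of this diff. As a rough orientation, here is a minimal, hypothetical T-SQL sketch of the "Data ingestion using SQL" pattern; the data source name and location, the table name, and the columns are illustrative assumptions, not the contents of data-pool/data-ingestion-sql.sql:

USE sales;
GO

-- Assumption: a data-pool data source; the LOCATION value mirrors the
-- SqlStoragePool pattern from bootstrap-sample-db.sql and may differ per release.
IF NOT EXISTS(SELECT * FROM sys.external_data_sources WHERE name = 'SqlDataPool')
    CREATE EXTERNAL DATA SOURCE SqlDataPool
    WITH (LOCATION = 'sqldatapool://service-mssql-controller:8080');
GO

-- Data pool external tables are distributed across the pool's SQL Server instances.
CREATE EXTERNAL TABLE [web_clickstream_clicks_data_pool]
(wcs_user_sk BIGINT, clicks INT)
WITH (DATA_SOURCE = SqlDataPool, DISTRIBUTION = ROUND_ROBIN);
GO

-- Ingest aggregated rows from the master instance into the data pool.
INSERT INTO [web_clickstream_clicks_data_pool]
SELECT wcs_user_sk, COUNT(*) AS clicks
FROM dbo.web_clickstreams
GROUP BY wcs_user_sk;
GO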
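Similarly, a hedged sketch of the "External table over Storage Pool" pattern: it reuses the SqlStoragePool data source created in bootstrap-sample-db.sql (see its diff below) and the /clickstream_data HDFS directory populated by the bootstrap scripts; the file format name and the column list (assumed from the TPCx-BB web_clickstreams schema) are illustrative:

USE sales;
GO

-- Assumption: a delimited-text file format matching the comma-delimited export.
CREATE EXTERNAL FILE FORMAT csv_file
WITH (FORMAT_TYPE = DELIMITEDTEXT,
      FORMAT_OPTIONS (FIELD_TERMINATOR = ',', FIRST_ROW = 1));
GO

-- External table over the CSV file uploaded to HDFS by the bootstrap script.
CREATE EXTERNAL TABLE [web_clickstreams_hdfs]
(wcs_click_date_sk BIGINT, wcs_click_time_sk BIGINT, wcs_sales_sk BIGINT,
 wcs_item_sk BIGINT, wcs_web_page_sk BIGINT, wcs_user_sk BIGINT)
WITH (DATA_SOURCE = SqlStoragePool,
      LOCATION = '/clickstream_data',
      FILE_FORMAT = csv_file);
GO

SELECT TOP 10 * FROM [web_clickstreams_hdfs];
GO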
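For the Machine Learning Services samples, the entry point from T-SQL is sp_execute_external_script. A minimal sketch, assuming Machine Learning Services (R) is installed, using the dbo.web_clickstreams_book_clicks view that bootstrap-sample-db.sql creates:

USE sales;
GO

-- One-time setup: allow external scripts (requires sufficient permissions).
EXEC sp_configure 'external scripts enabled', 1;
RECONFIGURE;
GO

-- Pass-through R script: the query result arrives in R as InputDataSet
-- and is returned unchanged as OutputDataSet.
EXEC sp_execute_external_script
    @language = N'R',
    @script = N'OutputDataSet <- InputDataSet',
    @input_data_1 = N'SELECT AVG(CAST(clicks_in_category AS FLOAT)) AS avg_clicks FROM dbo.web_clickstreams_book_clicks'
WITH RESULT SETS ((avg_clicks FLOAT));
GO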
samples/features/sql-big-data-cluster/bootstrap-sample-db.cmd

Lines changed: 18 additions & 14 deletions

@@ -1,17 +1,18 @@
 @echo off
 REM bootstrap sample database CMD script
 setlocal enableextensions
+setlocal enabledelayedexpansion
 set CLUSTER_NAMESPACE=%1
 set SQL_MASTER_IP=%2
 set SQL_MASTER_SA_PASSWORD=%3
-set KNOX_IP=%3
-set KNOX_PASSWORD=%4
+set KNOX_IP=%4
+set KNOX_PASSWORD=%5
 set STARTUP_PATH=%~dp0
+set TMP_DIR_NAME=%~nx0
 
 if NOT DEFINED CLUSTER_NAMESPACE goto :usage
 if NOT DEFINED SQL_MASTER_IP goto :usage
 if NOT DEFINED SQL_MASTER_SA_PASSWORD goto :usage
-if NOT DEFINED BACKUP_FILE_PATH goto :usage
 if NOT DEFINED KNOX_IP goto :usage
 if NOT DEFINED KNOX_PASSWORD set KNOX_PASSWORD=%SQL_MASTER_SA_PASSWORD%
 
@@ -23,49 +24,52 @@ for %%F in (sqlcmd.exe bcp.exe kubectl.exe curl.exe) do (
 )
 
 pushd "%tmp%"
+md %TMP_DIR_NAME% & cd %TMP_DIR_NAME%
 echo Downloading sample database backup file...
-curl -G "https://sqlchoice.blob.core.windows.net/sqlchoice/static/tpcxbb_1gb.bak" -o tpcxbb_1gb.bak
+%DEBUG% curl -G "https://sqlchoice.blob.core.windows.net/sqlchoice/static/tpcxbb_1gb.bak" -o tpcxbb_1gb.bak
 
 REM Copy the backup file, restore the database, create necessary objects and data file
 echo Copying database backup file...
 %DEBUG% kubectl cp tpcxbb_1gb.bak mssql-master-pool-0:/var/opt/mssql/data -c mssql-server -n %CLUSTER_NAMESPACE% || goto exit
 
 del tpcxbb_1gb.bak >NUL
-popd
 
 echo Configuring sample database...
-%DEBUG% sqlcmd -S %SQL_MASTER_INSTANCE% -Usa -P%SQL_MASTER_SA_PASSWORD% -i "%STARTUP_PATH%bootstrap-sample-db.sql" -o "%STARTUP_PATH%bootstrap.out" -I -b || goto exit
+%DEBUG% sqlcmd -S %SQL_MASTER_INSTANCE% -Usa -P%SQL_MASTER_SA_PASSWORD% -i "%STARTUP_PATH%bootstrap-sample-db.sql" -o "bootstrap.out" -I -b || goto exit
 
 for %%F in (web_clickstreams inventory customer) do (
 echo Exporting %%F data...
 if /i %%F EQU web_clickstreams (set DELIMITER=,) else (SET DELIMITER=^|)
-%DEBUG% bcp sales.dbo.%%F out "%STARTUP_PATH%%%F.csv" -S %SQL_MASTER_INSTANCE% -Usa -P%SQL_MASTER_SA_PASSWORD% -c -t"%DELIMITER%" -o "%STARTUP_PATH%%%F.out" -e "%STARTUP_PATH%%%F.err" || goto exit
+%DEBUG% bcp sales.dbo.%%F out "%%F.csv" -S %SQL_MASTER_INSTANCE% -Usa -P%SQL_MASTER_SA_PASSWORD% -c -t"!DELIMITER!" -o "%%F.out" -e "%%F.err" || goto exit
 )
 
 echo Exporting product_reviews data...
-%DEBUG% bcp "select pr_review_sk, replace(replace(pr_review_content, ',', ';'), '\"', '') from sales.dbo.product_reviews" queryout "%STARTUP_PATH%product_reviews.csv" -S %SQL_MASTER_INSTANCE% -Usa -P%SQL_MASTER_SA_PASSWORD% -c -t, -o "%STARTUP_PATH%product_reviews.out" -e "%STARTUP_PATH%product_reviews.err" || goto exit
+%DEBUG% bcp "select pr_review_sk, replace(replace(pr_review_content, ',', ';'), char(34), '') as pr_review_content from sales.dbo.product_reviews" queryout "product_reviews.csv" -S %SQL_MASTER_INSTANCE% -Usa -P%SQL_MASTER_SA_PASSWORD% -c -t, -o "product_reviews.out" -e "product_reviews.err" || goto exit
 
 REM Copy the data file to HDFS
-pushd "%STARTUP_PATH%"
 echo Uploading web_clickstreams data to HDFS...
 %DEBUG% curl -i -L -k -u root:%KNOX_PASSWORD% -X PUT "https://%KNOX_ENDPOINT%/gateway/default/webhdfs/v1/clickstream_data?op=MKDIRS" || goto exit
-%DEBUG% curl -i -L -k -u root:%KNOX_PASSWORD% -X PUT "https://%KNOX_ENDPOINT%/gateway/default/webhdfs/v1/clickstream_data/web_clickstreams.csv?op=create" -H "Content-Type: application/octet-stream" -T "web_clickstreams.csv" || goto exit
+%DEBUG% curl -i -L -k -u root:%KNOX_PASSWORD% -X PUT "https://%KNOX_ENDPOINT%/gateway/default/webhdfs/v1/clickstream_data/web_clickstreams.csv?op=create&overwrite=true&noredirect=true" -H "Content-Type: application/octet-stream" -T "web_clickstreams.csv" || goto exit
 
+echo.
 echo Uploading product_reviews data to HDFS...
 %DEBUG% curl -i -L -k -u root:%KNOX_PASSWORD% -X PUT "https://%KNOX_ENDPOINT%/gateway/default/webhdfs/v1/product_review_data?op=MKDIRS" || goto exit
-%DEBUG% curl -i -L -k -u root:%KNOX_PASSWORD% -X PUT "https://%KNOX_ENDPOINT%/gateway/default/webhdfs/v1/product_review_data/product_reviews.csv?op=create" -H "Content-Type: application/octet-stream" -T "product_reviews.csv" || goto exit
-:: del /q *.out *.err *.csv
-popd
+%DEBUG% curl -i -L -k -u root:%KNOX_PASSWORD% -X PUT "https://%KNOX_ENDPOINT%/gateway/default/webhdfs/v1/product_review_data/product_reviews.csv?op=create&overwrite=true&noredirect=true" -H "Content-Type: application/octet-stream" -T "product_reviews.csv" || goto exit
 
+%DEBUG% del /q *.out *.err *.csv
+
+popd
+%DEBUG% rd /q "%tmp%\%TMP_DIR_NAME%"
 endlocal
 exit /b 0
 goto :eof
 
 :exit
 echo Bootstrap of the sample database failed.
+echo Output and error files are in directory [%TMP%\%TMP_DIR_NAME%].
 exit /b 1
 
 :usage
-echo USAGE: %0 ^<CLUSTER_NAMESPACE^> ^<SQL_MASTER_IP^> ^<SQL_MASTER_SA_PASSWORD^> ^<BACKUP_FILE_PATH^> ^<KNOX_IP^> [^<KNOX_PASSWORD^>]
+echo USAGE: %0 ^<CLUSTER_NAMESPACE^> ^<SQL_MASTER_IP^> ^<SQL_MASTER_SA_PASSWORD^> ^<KNOX_IP^> [^<KNOX_PASSWORD^>]
 echo Default ports are assumed for SQL Master instance ^& Knox gateway.
 exit /b 0

samples/features/sql-big-data-cluster/bootstrap-sample-db.sh

Lines changed: 23 additions & 11 deletions
@@ -1,8 +1,10 @@
 #!/bin/bash
 set -e
 set -o pipefail
+STARTUP_PATH=$(dirname "$0")
+TMP_DIR_NAME=$(basename "$0")
 USAGE_MESSAGE="USAGE: $0 <CLUSTER_NAMESPACE> <SQL_MASTER_IP> <SQL_MASTER_SA_PASSWORD> <KNOX_IP> [<KNOX_PASSWORD>]"
-ERROR_MESSAGE="Bootstrap of the sample database failed."
+ERROR_MESSAGE="Bootstrap of the sample database failed. Output and error files are in directory [/tmp/$TMP_DIR_NAME]."
 
 # Print usage if mandatory parameters are missing
 : "${1:?$USAGE_MESSAGE}"
@@ -26,41 +28,51 @@ KNOX_ENDPOINT=$KNOX_IP:30443
 
 for util in sqlcmd.exe bcp.exe kubectl.exe curl.exe
 do
-echo Verifying $util is in path & which $util 1>NUL 2>NUL || (echo Unable to locate $util && exit 1)
+echo Verifying $util is in path && which $util 1>/dev/null 2>/dev/null || (echo Unable to locate $util && exit 1)
 done
 
 # Copy the backup file, restore the database, create necessary objects and data file
 pushd "/tmp"
+$DEBUG mkdir "$TMP_DIR_NAME" && cd "$TMP_DIR_NAME"
 echo Downloading sample database backup file...
 $DEBUG curl -G "https://sqlchoice.blob.core.windows.net/sqlchoice/static/tpcxbb_1gb.bak" -o tpcxbb_1gb.bak
 
 echo Copying database backup file...
 $DEBUG kubectl cp tpcxbb_1gb.bak mssql-master-pool-0:/var/opt/mssql/data -c mssql-server -n $CLUSTER_NAMESPACE || (echo $ERROR_MESSAGE && exit 1)
-rm tpcxbb_1gb.bak
-popd
+$DEBUG rm tpcxbb_1gb.bak
 
 echo Configuring sample database...
 # WSL ex: "/mnt/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/130/Tools/Binn/SQLCMD.EXE"
-$DEBUG sqlcmd -S $SQL_MASTER_INSTANCE -Usa -P$SQL_MASTER_SA_PASSWORD -i "bootstrap-sample-db.sql" -o "bootstrap.out" -I -b || (echo $ERROR_MESSAGE && exit 2)
+$DEBUG sqlcmd -S $SQL_MASTER_INSTANCE -Usa -P$SQL_MASTER_SA_PASSWORD -i "$STARTUP_PATH/bootstrap-sample-db.sql" -o "bootstrap.out" -I -b || (echo $ERROR_MESSAGE && exit 2)
 
-for table in web_clickstreams inventory
+for table in web_clickstreams inventory customer
 do
 echo Exporting $table data...
+if [ $table == web_clickstreams ]
+then
+DELIMITER=,
+else
+DELIMITER="|"
+fi
 # WSL ex: "/mnt/c/Program Files/Microsoft SQL Server/Client SDK/ODBC/130/Tools/Binn/bcp.exe"
-$DEBUG bcp sales.dbo.$table out "$table.csv" -S $SQL_MASTER_INSTANCE -Usa -P$SQL_MASTER_SA_PASSWORD -c -t, -e "$table.err" || (echo $ERROR_MESSAGE && exit 3)
+$DEBUG bcp sales.dbo.$table out "$table.csv" -S $SQL_MASTER_INSTANCE -Usa -P$SQL_MASTER_SA_PASSWORD -c -t"$DELIMITER" -e "$table.err" || (echo $ERROR_MESSAGE && exit 3)
 done
 
 echo Exporting product_reviews data...
-$DEBUG bcp "select pr_review_sk, replace(replace(pr_review_content, ',', ';'), '\"', '') from sales.dbo.product_reviews" queryout "product_reviews.csv" -S $SQL_MASTER_INSTANCE -Usa -P$SQL_MASTER_SA_PASSWORD -c -t, -e "product_reviews.err" || (echo $ERROR_MESSAGE && exit 3)
+$DEBUG bcp "select pr_review_sk, replace(replace(pr_review_content, ',', ';'), char(34), '') as pr_review_content from sales.dbo.product_reviews" queryout "product_reviews.csv" -S $SQL_MASTER_INSTANCE -Usa -P$SQL_MASTER_SA_PASSWORD -c -t, -e "product_reviews.err" || (echo $ERROR_MESSAGE && exit 3)
 
 # Copy the data file to HDFS
 echo Uploading web_clickstreams data to HDFS...
 $DEBUG curl -i -L -k -u root:$KNOX_PASSWORD -X PUT "https://$KNOX_ENDPOINT/gateway/default/webhdfs/v1/clickstream_data?op=MKDIRS" || (echo $ERROR_MESSAGE && exit 4)
-$DEBUG curl -i -L -k -u root:$KNOX_PASSWORD -X PUT "https://$KNOX_ENDPOINT/gateway/default/webhdfs/v1/clickstream_data/web_clickstreams.csv?op=create" -H 'Content-Type: application/octet-stream' -T "web_clickstreams.csv" || (echo $ERROR_MESSAGE && exit 5)
+$DEBUG curl -i -L -k -u root:$KNOX_PASSWORD -X PUT "https://$KNOX_ENDPOINT/gateway/default/webhdfs/v1/clickstream_data/web_clickstreams.csv?op=create&overwrite=true&noredirect=true" -H 'Content-Type: application/octet-stream' -T "web_clickstreams.csv" || (echo $ERROR_MESSAGE && exit 5)
 
+echo
 echo Uploading product_reviews data to HDFS...
 $DEBUG curl -i -L -k -u root:$KNOX_PASSWORD -X PUT "https://$KNOX_ENDPOINT/gateway/default/webhdfs/v1/product_review_data?op=MKDIRS" || (echo $ERROR_MESSAGE && exit 6)
-$DEBUG curl -i -L -k -u root:$KNOX_PASSWORD -X PUT "https://$KNOX_ENDPOINT/gateway/default/webhdfs/v1/product_review_data/product_reviews.csv?op=create" -H "Content-Type: application/octet-stream" -T "product_reviews.csv" || (echo $ERROR_MESSAGE && exit 7)
+$DEBUG curl -i -L -k -u root:$KNOX_PASSWORD -X PUT "https://$KNOX_ENDPOINT/gateway/default/webhdfs/v1/product_review_data/product_reviews.csv?op=create&overwrite=true&noredirect=true" -H "Content-Type: application/octet-stream" -T "product_reviews.csv" || (echo $ERROR_MESSAGE && exit 7)
+
+$DEBUG rm -f *.out *.err *.csv
+popd
 
-# rm -f *.out *.err *.csv
+$DEBUG rmdir "/tmp/$TMP_DIR_NAME"
 exit

samples/features/sql-big-data-cluster/bootstrap-sample-db.sql

Lines changed: 7 additions & 15 deletions
@@ -25,11 +25,14 @@ IF NOT EXISTS(SELECT * FROM sys.external_data_sources WHERE name = 'SqlStoragePo
 WITH (LOCATION = 'sqlhdfs://service-mssql-controller:8080');
 GO
 
--- Create view used for ML services training stored procedure
-CREATE OR ALTER VIEW [dbo].[web_clickstreams_book_clicks]
+-- Create view used for ML services training and scoring stored procedures
+CREATE OR ALTER VIEW [dbo].[web_clickstreams_book_clicks]
 AS
 SELECT
-q.clicks_in_category,
+/* There is a bug in the TPCx-BB data generator which results in data where all users have purchased books.
+   As a result, we cannot use the data as-is for ML training purposes, so we treat users with 1-5 clicks
+   in the book category as not interested in books. */
+CASE WHEN q.clicks_in_category < 6 THEN 0 ELSE q.clicks_in_category END AS clicks_in_category,
 CASE WHEN cd.cd_education_status IN ('Advanced Degree', 'College', '4 yr Degree', '2 yr Degree') THEN 1 ELSE 0 END AS college_education,
 CASE WHEN cd.cd_gender = 'M' THEN 1 ELSE 0 END AS male,
 COALESCE(cd.cd_credit_rating, 'Unknown') as cd_credit_rating,

@@ -62,15 +65,4 @@
 ) AS q
 INNER JOIN customer as c ON q.wcs_user_sk = c.c_customer_sk
 INNER JOIN customer_demographics as cd ON c.c_current_cdemo_sk = cd.cd_demo_sk;
-GO
-
--- Create table for storing the machine learning models
-IF NOT EXISTS(SELECT * FROM sys.tables WHERE name = 'sales_models')
-CREATE TABLE sales_models (
-model_name varchar(100) NOT NULL PRIMARY KEY,
-model varbinary(max) NOT NULL,
-model_native varbinary(max) NULL,
-created_by nvarchar(300) NOT NULL DEFAULT(SYSTEM_USER),
-create_time datetime2 NOT NULL DEFAULT(SYSDATETIME())
-);
-GO
+GO
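As an illustrative follow-up (not part of the commit), the effect of the CASE expression above can be checked with a quick distribution query against the view; users with 1-5 clicks in the book category now land in the 0 bucket:

USE sales;
GO

-- Label distribution produced by dbo.web_clickstreams_book_clicks.
SELECT clicks_in_category, COUNT(*) AS user_count
FROM dbo.web_clickstreams_book_clicks
GROUP BY clicks_in_category
ORDER BY clicks_in_category;
GO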

samples/features/sql-big-data-cluster/data-virtualization/oracle/README.md

Lines changed: 1 addition & 1 deletion
@@ -1,6 +1,6 @@
 # Data virtualization in SQL Server 2019
 
-**Applies to: SQL Server 2019 on Windows or Linux, SQL Server 2019 big data cluster**
+***Applies to:*** SQL Server 2019 on Windows or Linux, SQL Server 2019 big data cluster
 
 SQL Server 2019 introduces new ODBC connectors to data sources like SQL Server, Oracle, MongoDB and Teradata. These connectors can be used from stand-alone SQL Server 2019 on Windows or Linux or SQL Server 2019 big data cluster.
 
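For orientation, the general shape of a PolyBase external table over one of these ODBC sources (Oracle here) looks roughly like the sketch below; the credential, server, and three-part LOCATION are placeholders, and the working versions live in this folder's sample scripts:

-- One-time setup, if the database has no master key yet (placeholder password).
CREATE MASTER KEY ENCRYPTION BY PASSWORD = '<strong password>';
GO

CREATE DATABASE SCOPED CREDENTIAL [OracleCredential]
WITH IDENTITY = '<oracle user>', SECRET = '<oracle password>';
GO

CREATE EXTERNAL DATA SOURCE [OracleDataSource]
WITH (LOCATION = 'oracle://<oracle server>:1521', CREDENTIAL = [OracleCredential]);
GO

-- Columns follow the TPC inventory schema used elsewhere in these samples.
CREATE EXTERNAL TABLE [oracle_inventory]
(inv_date_sk INT, inv_item_sk INT, inv_warehouse_sk INT, inv_quantity_on_hand INT)
WITH (DATA_SOURCE = OracleDataSource, LOCATION = '[<database>].[<schema>].[INVENTORY]');
GO

SELECT TOP 10 * FROM [oracle_inventory];
GO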

samples/features/sql-big-data-cluster/data-virtualization/oracle/setup/README.md

Lines changed: 3 additions & 5 deletions

@@ -1,11 +1,9 @@
 # Oracle setup
 
-This folder contains scripts that can be executed on an Oracle server to create the necessary objects for data virtualization in SQL Server 2019 big data cluster.
+This folder contains scripts that can be executed on an Oracle server to create the necessary objects for data virtualization in SQL Server 2019+ or SQL Server 2019+ big data cluster.
 
 # Instructions
 
-1. Connect to the Oracle instance.
+***Before you begin***, you need the Oracle instance name and credentials.
 
-1. Execute the [sales-user.sql](sales-user.sql) script. This script creates the sample user. If there is a name conflict, please change the script user/credentials.
-
-1. Execute the [inventory.sql](inventory.sql) script. This script creates the inventory table.
+1. Execute the [bootstrap-oracle.cmd](bootstrap-oracle.cmd) script to create the necessary objects in Oracle.
samples/features/sql-big-data-cluster/data-virtualization/oracle/setup/bootstrap-oracle.cmd

Lines changed: 12 additions & 10 deletions
@@ -9,19 +9,21 @@ if NOT DEFINED ORACLE_SERVER goto :usage
 if NOT DEFINED ORACLE_USER goto :usage
 if NOT DEFINED ORACLE_PASSWORD goto :usage
 
-echo Verifying sqlplus.exe is in path & CALL WHERE /Q sqlplus.exe || GOTO exit
-echo Verifying sqlldr.exe is in path & CALL WHERE /Q sqlldr.exe || GOTO exit
+for %%F in (sqlplus.exe sqlldr.exe) do (
+echo Verifying %%F is in path & CALL WHERE /Q %%F || GOTO exit
+)
 
-echo Creating user & tables...
-echo exit | sqlplus -S %ORACLE_USER%/%ORACLE_PASSWORD%@%ORACLE_SERVER% @sales-user.sql || GOTO exit
-echo exit | sqlplus -S %ORACLE_USER%/%ORACLE_PASSWORD%@%ORACLE_SERVER% @inventory.sql || GOTO exit
-echo exit | sqlplus -S %ORACLE_USER%/%ORACLE_PASSWORD%@%ORACLE_SERVER% @customer.sql || GOTO exit
 
-echo Loading tables data...
-sqlldr CONTROL=inventory.ctl userid=%ORACLE_USER%/%ORACLE_PASSWORD%@%ORACLE_SERVER% || GOTO exit
-sqlldr CONTROL=customer.ctl userid=%ORACLE_USER%/%ORACLE_PASSWORD%@%ORACLE_SERVER% || GOTO exit
+for %%F in (sales-user.sql inventory.sql customer.sql) do (
+echo Executing [%%F]...
+echo exit | sqlplus -S %ORACLE_USER%/%ORACLE_PASSWORD%@%ORACLE_SERVER% @%%F || GOTO exit
+)
+
+for %%F in (inventory.ctl customer.ctl) do (
+echo Loading [%%F]...
+sqlldr CONTROL=%%F userid=%ORACLE_USER%/%ORACLE_PASSWORD%@%ORACLE_SERVER% || GOTO exit
+)
 
-:: del /q *.out *.err *.csv
 endlocal
 exit /b 0
 goto :eof
