
Commit 0f8e6e9

Merge pull request #3 from IBM/using-docker-image
Update the README.md with instructions on how to use the Db2 Docker image instead of Db2 Warehouse on Cloud
2 parents 3e07764 + d6c3aba commit 0f8e6e9

3 files changed: +176 −27 lines


README.md

Lines changed: 114 additions & 9 deletions
@@ -13,36 +13,141 @@ This is an application which uses Node.js to connect to IBM Db2 Warehouse on Clo
 
 ## Steps
 
-1. [Clone the repo](#1-clone-the-repo)
-1. [Create IBM Db2 Warehouse on Cloud](#2-create-ibm-db2-warehouse-on-cloud)
-1. [Create schema and tables](#3-create-schema-and-tables)
-1. [Add Db2 credentials to .env file](#4-add-db2-credentials-to-env-file)
-1. [Run the application](#5-run-the-application)
+1. [Clone The Repo](#1-clone-the-repo)
+2. [Create an IBM Db2 Instance](#2-create-an-ibm-db2-instance)
+3. [Create Schema and Tables](#3-create-schema-and-tables)
+4. [Add Db2 Credentials to .env File](#4-add-db2-credentials-to-env-file)
+5. [Run The Application](#5-run-the-application)
 
 ### 1. Clone the repo
 
 ```bash
 git clone https://github.com/IBM/crud-using-nodejs-and-db2.git
 ```
 
-### 2. Create IBM Db2 Warehouse on Cloud
+### 2. Create an IBM Db2 Instance
+
+Once we have cloned the repository, the next step is to create the database that will hold our house sales data. There are two ways to do this. One is to create an IBM Db2 Warehouse on Cloud instance, which is hosted on the cloud. If you prefer to have your database on premises or running locally, you can use the Db2 Docker image instead.
+
+Choose which type of database you would like and follow the corresponding instructions:
+
+1. [Create IBM Db2 Warehouse on Cloud](#2a-create-ibm-db2-warehouse-on-cloud)
+2. [Create IBM Db2 Database Locally Using Docker Image](#2b-create-an-ibm-db2-on-premise-database)
+
+#### 2a. Create IBM Db2 Warehouse on Cloud
 
 Create the Db2 Warehouse on Cloud service and make sure to note the credentials using the following link:
 
 * [**IBM Db2 Warehouse on Cloud**](https://cloud.ibm.com/catalog/services/db2-warehouse)
 
-### 3. Create schema and tables
+#### 2b. Create an IBM Db2 On Premise Database
+
+Instead of creating the Db2 Warehouse on Cloud service, we can instantiate the database locally using the free IBM Db2 Docker image.
+
+Prerequisites:
+
+* A [Docker](https://www.docker.com) account
+* [Docker Desktop](https://www.docker.com/products/docker-desktop) installed on your machine
+* Being logged in to your Docker account in Docker Desktop
+
+Steps to get your Db2 instance running locally:
+
+* Create a folder named `db2`
+* Open a terminal window and make sure your current directory is the one that contains the `db2` folder
+* Run the following commands
+
+```bash
+docker pull ibmcom/db2
+
+docker run -itd --name mydb2 --privileged=true -p 50000:50000 -e LICENSE=accept -e DB2INST1_PASSWORD=hackathon -e DBNAME=homesalesdb -v db2:/database ibmcom/db2
+
+docker exec -ti mydb2 bash -c "su - db2inst1"
+```
+
+Once this is done, it will create a Db2 Docker container with the following configuration (a quick connectivity check is sketched right after this list):
+
+* IP address/domain: `localhost`
+* Port: `50000`
+* Database name: `homesalesdb`
+* Username: `db2inst1`
+* Password: `hackathon`
+
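For a quick sanity check that the container is reachable (an editor-added sketch, not part of this commit), the same Db2 driver the application uses can open a connection with the values above. This assumes the app's `ibmdb` object comes from the `ibm_db` npm package:

```javascript
// Minimal connectivity check for the local Docker Db2 instance described above.
// Assumes the `ibm_db` npm package (the driver behind the app's `ibmdb.open` calls).
const ibmdb = require('ibm_db');

// Connection string built from the container settings listed above.
const connStr =
  'DATABASE=homesalesdb;HOSTNAME=localhost;PORT=50000;PROTOCOL=TCPIP;' +
  'UID=db2inst1;PWD=hackathon;';

ibmdb.open(connStr, function (err, conn) {
  if (err) {
    // The container may still be initializing; retry after a few minutes.
    console.error('Connection failed:', err.message);
    return;
  }
  // SYSIBM.SYSDUMMY1 is a one-row system table, handy as a smoke test.
  conn.query('SELECT 1 FROM SYSIBM.SYSDUMMY1', function (err, rows) {
    if (err) {
      console.error('Test query failed:', err.message);
    } else {
      console.log('Connected to homesalesdb, test query returned:', rows);
    }
    conn.close(function () {});
  });
});
```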
+### 3. Create Schema and Tables
+
+Now that we have created our database, we need to import the data from the CSV file into it. We will create a schema called `DB2WML` and two tables, `HOME_SALES` and `HOME_ADDRESS`. `HOME_SALES` will store the data we retrieve from the CSV file, and `HOME_ADDRESS` will hold the address associated with each home.
+
+Depending on which type of database you have (Cloud or on-premise), the steps are a little different. Please follow the corresponding steps:
+
+1. [Create Schema and Tables for IBM Db2 Warehouse on Cloud](#3a-create-schema-and-tables-for-ibm-db2-warehouse-on-cloud)
+2. [Create Schema and Tables for IBM Db2 Docker Image](#3b-create-schema-and-tables-for-ibm-db2-docker-image)
+
+#### 3a. Create Schema and Tables for IBM Db2 Warehouse on Cloud
 
 In the Db2 warehouse resource page, click on `Manage` and go to DB2 console by clicking the button `Open Console`. In the console do the following to load your data.
 
 * Click `Load` from the hamburger menu.
 * Click `Browse files` or you can drag files, select the [data/home-sales-training-data.csv](data/home-sales-training-data.csv) and click `Next`
 * Choose existing schema or create a new one named `DB2WML` by clicking `+ New Schema`
 * Create a new table named `HOME_SALES` by clicking `+ New Table` on the schema you created and click `Next`
-* Make sure the column names and data types displayed are correct, then cick `Next`
+* Make sure the column names and data types displayed are correct, then click `Next`
 * Click `Begin Load` to load the data
 
-Once this is done it will create a table `HOME_SALES` under schema `DB2WML` which will be used by the Node.js application.
+We also need to create a table for `HOME_ADDRESS`, which will store the address of each home. We can't use the same instructions we used for `HOME_SALES`, since there is no data to load.
+
+* Click `Run SQL` from the hamburger menu.
+* Click `Blank`, which will open a blank SQL editor
+* Run the command
+
+```sql
+CREATE TABLE DB2WML.HOME_ADDRESS (ADDRESS1 VARCHAR(50), ADDRESS2 VARCHAR(50), CITY VARCHAR(50), STATE VARCHAR(5), ZIPCODE INTEGER, COUNTRY VARCHAR(50), HOME_ID INTEGER)
+```
+
+Once this is done, the tables `HOME_SALES` and `HOME_ADDRESS` will exist under the schema `DB2WML` and will be used by the Node.js application.
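To make the relationship between the two tables concrete (an editor-added sketch, not part of this commit): each `HOME_ADDRESS` row points back to a `HOME_SALES` row through `HOME_ID`. Assuming the schema above, that a home with `ID` 1 was loaded, and whatever connection string you use (the `DB2_CONN_STR` variable below is just a placeholder), inserting and reading back an address could look like this:

```javascript
const ibmdb = require('ibm_db');

// Placeholder: substitute the DATABASE/HOSTNAME/PORT/UID/PWD string you actually use.
const connStr = process.env.DB2_CONN_STR;

ibmdb.open(connStr, function (err, conn) {
  if (err) { console.error(err); return; }

  // Parameter markers (?) keep this example free of string concatenation.
  const insertSql =
    'INSERT INTO DB2WML.HOME_ADDRESS ' +
    '(ADDRESS1, ADDRESS2, CITY, STATE, ZIPCODE, COUNTRY, HOME_ID) ' +
    'VALUES (?, ?, ?, ?, ?, ?, ?)';

  // Illustrative sample values only.
  conn.query(insertSql, ['123 Main St', '', 'Ames', 'IA', 50010, 'USA', 1], function (err) {
    if (err) { console.error(err); return; }

    // Join the sale record with its address through HOME_ID.
    const joinSql =
      'SELECT S.ID, S.SALEPRICE, A.ADDRESS1, A.CITY ' +
      'FROM DB2WML.HOME_SALES S ' +
      'JOIN DB2WML.HOME_ADDRESS A ON A.HOME_ID = S.ID ' +
      'WHERE S.ID = ?';

    conn.query(joinSql, [1], function (err, rows) {
      if (err) { console.error(err); }
      else { console.log(rows); }
      conn.close(function () {});
    });
  });
});
```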
+
+#### 3b. Create Schema and Tables for IBM Db2 Docker Image
+
+Exit the container shell (for example, by typing `exit`). Then copy the sample data into the on-premise Db2 container:
+
+```bash
+docker cp data/home-sales-training-data.csv mydb2:home-sales-training-data.csv
+```
+
+Re-enter the container shell:
+
+```bash
+docker exec -ti mydb2 bash -c "su - db2inst1"
+```
+
+Steps to create the schema and tables:
+
+* Connect to the database `homesalesdb`. NOTE: This command may not work right away, since the container takes some time to create the database. If it fails, please wait a couple of minutes and then try again.
+
+```bash
+db2 connect to homesalesdb
+```
+
+* Create the schema `DB2WML`
+
+```bash
+db2 'CREATE SCHEMA DB2WML'
+```
+
+* Create the tables `HOME_SALES` and `HOME_ADDRESS` within the schema `DB2WML`
+
+```bash
+db2 'CREATE TABLE DB2WML.HOME_SALES (ID SMALLINT, LOTAREA INTEGER, BLDGTYPE VARCHAR(6),HOUSESTYLE VARCHAR(6), OVERALLCOND INTEGER, YEARBUILT INTEGER, ROOFSTYLE VARCHAR(7), EXTERCOND VARCHAR(2), FOUNDATION VARCHAR(6), BSMTCOND VARCHAR(2), HEATING VARCHAR(4), HEATINGQC VARCHAR(2),CENTRALAIR VARCHAR(1), ELECTRICAL VARCHAR(5), FULLBATH INTEGER, HALFBATH INTEGER, BEDROOMABVGR INTEGER, KITCHENABVGR VARCHAR(2), KITCHENQUAL VARCHAR(2), TOTRMSABVGRD INTEGER, FIREPLACES INTEGER, FIREPLACEQU VARCHAR(2), GARAGETYPE VARCHAR(7), GARAGEFINISH VARCHAR(3), GARAGECARS INTEGER, GARAGECOND VARCHAR(2), POOLAREA INTEGER, POOLQC VARCHAR(2), FENCE VARCHAR(6), MOSOLD INTEGER, YRSOLD INTEGER, SALEPRICE INTEGER )'
+
+db2 'CREATE TABLE DB2WML.HOME_ADDRESS (ADDRESS1 VARCHAR(50), ADDRESS2 VARCHAR(50), CITY VARCHAR(50), STATE VARCHAR(5), ZIPCODE INTEGER, COUNTRY VARCHAR(50), HOME_ID INTEGER)'
+```
+
+* Load the data from the CSV file into the table `HOME_SALES` (a quick row-count check is sketched after this block)
+
+```bash
+db2 'IMPORT FROM ../../../home-sales-training-data.csv OF DEL SKIPCOUNT 1 INSERT INTO DB2WML.HOME_SALES'
+```
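To confirm the import worked (an editor-added sketch, not part of this commit), you can also count the loaded rows from Node.js with the same `ibm_db` driver and the container settings listed in step 2b:

```javascript
const ibmdb = require('ibm_db');

// Same settings the Docker container above was started with.
const connStr =
  'DATABASE=homesalesdb;HOSTNAME=localhost;PORT=50000;PROTOCOL=TCPIP;' +
  'UID=db2inst1;PWD=hackathon;';

ibmdb.open(connStr, function (err, conn) {
  if (err) { console.error(err); return; }
  // The CSV's header row was skipped by SKIPCOUNT 1, so this should match its data rows.
  conn.query('SELECT COUNT(*) AS N FROM DB2WML.HOME_SALES', function (err, rows) {
    if (err) { console.error(err); }
    else { console.log('HOME_SALES row count:', rows[0]); }
    conn.close(function () {});
  });
});
```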
 
 ### 4. Add Db2 credentials to .env file
 
server.js

Lines changed: 57 additions & 17 deletions
@@ -83,7 +83,6 @@ let connStr = "DATABASE="+process.env.DB_DATABASE+";HOSTNAME="+process.env.DB_HO
 
 app.post('/getData', function(request, response){
   console.log('GET DATA API CALL:');
-  console.log(request);
   ibmdb.open(connStr, function (err,conn) {
     if (err){
       return response.json({success:-1, message:err});
@@ -101,7 +100,6 @@ app.post('/getData', function(request, response){
 
 app.post('/getUniqueData', function(request, response){
   console.log('GET UNIQUE DATA API CALL:');
-  console.log(request);
   ibmdb.open(connStr, function (err,conn) {
     if (err){
       return response.json({success:-1, message:err});
@@ -116,7 +114,13 @@ app.post('/getUniqueData', function(request, response){
           return response.json({success:-3, message:err});
         }
         conn.close(function () {
-          console.log(data2);
+          console.log(data);
+          console.log(data2.length);
+          if (data2.length == 0){
+            data2[0] = {'ADDRESS1': '', 'ADDRESS2': '','CITY': '','STATE': '','COUNTRY': '','ZIPCODE': '','HOME_ID': data[0]['ID']};
+            console.log(data2);
+          }
+
           return response.json({success:1, message:'Data Received!', data:data,data2:data2 });
         });
       });
@@ -127,28 +131,58 @@ app.post('/getUniqueData', function(request, response){
 
 app.post('/updateDataEntry', function(request, response){
   console.log('UPDATE DATA API CALL:');
-  console.log(request);
   ibmdb.open(connStr, function (err,conn) {
     if (err){
       return response.json({success:-1, message:err});
     }
 
 
-    var str2 = "UPDATE DB2WML.HOME_ADDRESS SET ADDRESS1='"+request.body.addressInfo.address1+"',ADDRESS2='"+request.body.addressInfo.address2+"',CITY='"+request.body.addressInfo.city+"',STATE='"+request.body.addressInfo.state+"',COUNTRY='"+request.body.addressInfo.country+"' WHERE HOME_ID="+request.body.id+";";
+    var str2 = "UPDATE DB2WML.HOME_ADDRESS SET ADDRESS1='"+request.body.addressInfo.address1+"',ADDRESS2='"+request.body.addressInfo.address2+"',CITY='"+request.body.addressInfo.city+"',STATE='"+request.body.addressInfo.state+"',COUNTRY='"+request.body.addressInfo.country+"',ZIPCODE="+request.body.addressInfo.zipcode+" WHERE HOME_ID="+request.body.id+";";
+
+    var str4 = "INSERT INTO DB2WML.HOME_ADDRESS (ADDRESS1, ADDRESS2, CITY, STATE,ZIPCODE, COUNTRY,HOME_ID) VALUES ('"+request.body.addressInfo.address1+"', '"+request.body.addressInfo.address2+"', '"+request.body.addressInfo.city+"', '"+request.body.addressInfo.state+"', "+request.body.addressInfo.zipcode+", '"+request.body.addressInfo.country+"', "+request.body.id+");";
+
+
 
     var str = "UPDATE DB2WML.HOME_SALES SET LOTAREA="+request.body.data.lotArea+", YEARBUILT="+request.body.data.yearBuilt+", BLDGTYPE='"+request.body.data.bldgType+"',HOUSESTYLE='"+request.body.data.houseStyle+"',OVERALLCOND="+request.body.data.overallCond+",ROOFSTYLE='"+request.body.data.roofStyle+"',EXTERCOND='"+request.body.data.exterCond+"',FOUNDATION='"+request.body.data.foundation+"',BSMTCOND='"+request.body.data.bsmtCond+"',HEATING='"+request.body.data.heating+"',HEATINGQC='"+request.body.data.heatingQC+"',CENTRALAIR='"+request.body.data.centralAir+"',ELECTRICAL='"+request.body.data.electrical+"',FULLBATH="+request.body.data.fullBath+",HALFBATH="+request.body.data.halfBath+",BEDROOMABVGR="+request.body.data.bedroomAbvGr+",KITCHENABVGR="+request.body.data.kitchenAbvGr+",KITCHENQUAL='"+request.body.data.kitchenQual+"',TOTRMSABVGRD="+request.body.data.tempotRmsAbvGrd+",FIREPLACES="+request.body.data.fireplaces+",FIREPLACEQU='"+request.body.data.fireplaceQu+"',GARAGETYPE='"+request.body.data.garageType+"',GARAGEFINISH='"+request.body.data.garageFinish+"',GARAGECARS="+request.body.data.garageCars+",GARAGECOND='"+request.body.data.garageCond+"',POOLAREA="+request.body.data.poolArea+",POOLQC='"+request.body.data.poolQC+"',FENCE='"+request.body.data.fence+"',MOSOLD="+request.body.data.moSold+",YRSOLD="+request.body.data.yrSold+",SALEPRICE="+request.body.data.salePrice+" WHERE ID="+request.body.id+";";
 
+    var str3 = "SELECT * FROM DB2WML.HOME_ADDRESS WHERE HOME_ID="+request.body.id + ";";
+
     conn.query(str, function (err, data) {
       if (err){
         return response.json({success:-2, message:err});
       }
-      conn.query(str2, function (err, data) {
+      conn.query(str3, function (err, data2) {
+        console.log(data);
         if (err){
           return response.json({success:-3, message:err});
         }
-        conn.close(function () {
-          return response.json({success:1, message:'Data Edited!'});
-        });
+        else{
+          if (data2.length == 0 ){
+            conn.query(str4, function (err, data) {
+              if (err){
+                return response.json({success:-2, message:err});
+              }
+              else{
+                conn.close(function () {
+                  return response.json({success:1, message:'Data Edited!'});
+                });
+              }
+            });
+          }
+          else{
+            conn.query(str2, function (err, data) {
+              if (err){
+                return response.json({success:-2, message:err});
+              }
+              else{
+                conn.close(function () {
+                  return response.json({success:1, message:'Data Edited!'});
+                });
+              }
+            });
+          }
+        }
+
       });
     });
   });
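The change above turns `/updateDataEntry` into an insert-or-update on `DB2WML.HOME_ADDRESS`: it runs `str3` to look up an existing address by `HOME_ID`, runs `str4` (INSERT) when none exists, and `str2` (UPDATE) otherwise. A condensed sketch of that flow, for readability only (an editor-added illustration, not the committed code; the function name is hypothetical):

```javascript
// Condensed view of the new /updateDataEntry flow shown above (illustration only).
// conn, response, and the four SQL strings are the same ones built in the route handler.
function upsertHomeAddress(conn, response, str, str2, str3, str4) {
  conn.query(str, function (err) {                        // 1. update HOME_SALES
    if (err) return response.json({success:-2, message:err});
    conn.query(str3, function (err, data2) {              // 2. is there an address row for this HOME_ID?
      if (err) return response.json({success:-3, message:err});
      var next = data2.length == 0 ? str4 : str2;         // 3. INSERT if missing, UPDATE otherwise
      conn.query(next, function (err) {
        if (err) return response.json({success:-2, message:err});
        conn.close(function () {
          return response.json({success:1, message:'Data Edited!'});
        });
      });
    });
  });
}
```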
@@ -227,14 +261,20 @@ app.get('/predict', function(request, response){
 
 app.post('/geocode', function(request, response){
   // Using callback
-  geocoder.geocode(request.body.address1 + ", " + request.body.city + ", " + request.body.state + ", " + request.body.zipcode, function(err, res) {
-    if (err){
-      return response.json({success:-2, message:err});
-    }
-    else{
-      return response.json({success:1, message:"WE DID IT", data:res} );
-    }
-  });
+  if (request.body.address1 == ''){
+    return response.json({success:1, message:"no address"});
+  }
+  else {
+    geocoder.geocode(request.body.address1 + ", " + request.body.city + ", " + request.body.state + ", " + request.body.zipcode, function(err, res) {
+      if (err){
+        return response.json({success:-2, message:err});
+      }
+      else{
+        return response.json({success:1, message:"WE DID IT", data:res} );
+      }
+    });
+  }
+
 })
 

src/app/edit-data/edit-data.component.ts

Lines changed: 5 additions & 1 deletion
@@ -109,7 +109,6 @@ export class EditDataComponent implements OnInit {
       console.log('rowID: ' + this.rowID);
     })
     this.getDataEntry();
-    console.log(this.model);
   }
 
 
@@ -264,6 +263,7 @@ export class EditDataComponent implements OnInit {
       console.log(data['message']);
     }
     else{
+      console.log(data['message']);
       localStorage.setItem("dataUpdated","true");
       this._router.navigate(['/viewData']);
     }
@@ -279,10 +279,14 @@ export class EditDataComponent implements OnInit {
       console.log(data['message']);
     }
     else{
+
+
       this.data = data['data'][0];
       this.data2 = data['data2'][0];
       this.showMessage = false;
       this.showData = true;
+      console.log(this.data2);
+      console.log(this.data);
 
     }
   })
