
Commit 2d8f4ca

MattCollins84 authored and glynnbird committed

CMS and other additions (#47)

* Start of adding CMS page
* CMS delete works
* CMS first attempt feature complete
* First attempt at API Reference
* tidy up API ref
* cache parameter
* Data based examples on input forms, Inline deletes, Clickable Facets, Some tidy up
* Cleaner local configuration
* Readme tidy up
* Update README.md
* Use Passport to allow basic HTTP auth in lockdown mode
* Update README.md

1 parent 7bc3f22 · commit 2d8f4ca

21 files changed: +1,517 −70 lines

.gitignore

Lines changed: 1 addition & 0 deletions

@@ -8,3 +8,4 @@ start.sh
.project
npm-debug.log
.vscode
+todo.txt

README.md

Lines changed: 179 additions & 9 deletions

@@ -1,6 +1,6 @@
# Overview: Simple Search Service

Simple Search Service is an IBM Bluemix app that lets you quickly create a faceted search engine, exposing an API you can use to bring search into your own apps. The service also creates a website that lets you preview the API and test it against your own data, as well as manage your data via a simple CMS.

Once deployed, use the browser to upload CSV or TSV data. Specify the fields to facet, and the service handles the rest.

@@ -12,14 +12,16 @@ The application uses these Bluemix services:
* a Cloudant database
* a Redis in-memory database from Compose.io (optional)

Once the data is uploaded, you can use the UI to browse and manage your data via the integrated CMS. Additionally, a CORS-enabled, cached API endpoint is available at `<your domain name>/search`. The endpoint takes advantage of Cloudant's built-in integration for Lucene full-text indexing. Here's what you get:

* fielded search - `?q=colour:black+AND+brand:fender`
* free-text search - `?q=black+fender+strat`
* pagination - `?q=black+fender+strat&bookmark=<xxx>`
* faceting
* caching of popular searches

You can use this along with the rest of the API to integrate the Simple Search Service into your apps. For a full API reference, [click here](API Reference.md).

While this app is a demo to showcase how easily you can build an app on Bluemix using Node.js and Cloudant, it also provides a mature search API that scales with the addition of multiple Simple Search Service nodes and a centralized cache using Redis by Compose.io. In fact, a similar architecture powers the search experience in the Bluemix services catalog.

A more detailed walkthrough of using Simple Search Service is available [here](https://developer.ibm.com/clouddataservices/2016/01/21/introducing-simple-faceted-search-service/).
@@ -42,13 +44,18 @@ The fastest way to deploy this application to Bluemix is to click the **Deploy t

Clone this repository, then run `npm install` to add the Node.js libraries required to run the app.

Then create some environment variables that contain your Cloudant URL and, optionally, your Redis details:

```sh
# Cloudant URL
export SSS_CLOUDANT_URL='https://<USERNAME>:<PASSWORD>@<HOSTNAME>'

# Redis host and password
export SSS_REDIS_HOST='127.0.0.1:6379'
export SSS_REDIS_PASSWORD='redispassword'
```

replacing the `USERNAME`, `PASSWORD` and `HOSTNAME` placeholders with your own Cloudant account's details. If your Redis server does not require a password, do not set the `SSS_REDIS_PASSWORD` environment variable.

Then run:

@@ -58,7 +65,7 @@ node app.js

## Lockdown mode

If you have uploaded your content into the Simple Search Service but now want only the `/search` endpoint to be available publicly, you can enable "Lockdown mode".

Simply set an environment variable called `LOCKDOWN` to `true` before running the Simple Search Service:

@@ -69,9 +76,172 @@ node app.js

or set a custom environment variable in Bluemix.

When lockdown mode is detected, all web requests will receive a `401 Unauthorized` response, except for the `/search` endpoint, which will continue to work. This prevents your data from being modified until lockdown mode is switched off again by removing the environment variable.

If you wish to access the Simple Search Service whilst in lockdown mode, you can enable basic HTTP authentication by setting two more environment variables:

* `SSS_LOCKDOWN_USERNAME`
* `SSS_LOCKDOWN_PASSWORD`

When these are set, you can bypass lockdown mode by providing a matching username and password. If you access the UI, your browser will prompt you for these details. If you want to access the API, you can provide the username and password as part of your request:

```sh
curl -X GET 'http://<yourdomain>/row/4dac2df712704b397f1b64a1c8e25033' --user <username>:<password>
```
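A non-shell client just needs to send the standard `Authorization: Basic` header. Here is a minimal Python sketch (the `admin`/`secret` credentials are hypothetical) of the header value that `curl --user` computes for you:

```python
import base64

def basic_auth_header(username, password):
    # curl --user <u>:<p> sends "Authorization: Basic base64(u:p)"
    token = base64.b64encode(f"{username}:{password}".encode("utf-8")).decode("ascii")
    return f"Basic {token}"

print(basic_auth_header("admin", "secret"))  # Basic YWRtaW46c2VjcmV0
```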
## API Reference

The Simple Search Service has an API that allows you to manage your data outside of the provided UI. Use this to integrate the Simple Search Service with your applications.

### Search

Search is provided by the `GET /search` endpoint.

#### Fielded Search

Search on any of the indexed fields in your dataset using fielded search.

```bash
# Return any docs where colour=black
GET /search?q=colour:black
```

Fielded search uses [Cloudant Search](https://cloudant.com/for-developers/search/).
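When calling the endpoint from code, the Lucene query string needs to be percent-encoded. A small Python sketch (the domain is a made-up placeholder for your deployed app):

```python
from urllib.parse import quote

def search_url(base, query):
    # Percent-encode the query: ':' and spaces are not URL-safe
    return f"{base}/search?q={quote(query)}"

print(search_url("https://my-sss.mybluemix.net", "colour:black AND brand:fender"))
# https://my-sss.mybluemix.net/search?q=colour%3Ablack%20AND%20brand%3Afender
```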
#### Free-text Search

Search across all fields in your dataset using free-text search.

```bash
# Return any docs where 'black' is mentioned
GET /search?q=black
```
#### Pagination

Get the next page of results using the `bookmark` parameter. This is provided in all results from the `/search` endpoint (see the example response below). Pass it in to the next search (with the same query parameters) to return the next set of results.

```bash
# Return the next set of docs where 'black' is mentioned
GET /search?q=black&bookmark=<...>
```

It is possible to alter the number of results returned using the `limit` parameter.

```bash
# Return the next set of docs where 'black' is mentioned, 10 at a time
GET /search?q=black&bookmark=<...>&limit=10
```

It is possible to control whether or not the cache is used via the `cache` parameter (defaults to `true`).

```bash
# Return the next set of docs where 'black' is mentioned, bypassing the cache
GET /search?q=black&bookmark=<...>&cache=false
```
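The bookmark flow above can be wrapped in a loop. In this hedged Python sketch, `fetch` stands in for whatever HTTP call your app makes to `GET /search` (it is assumed to return the parsed JSON body); the loop stops when a page comes back empty:

```python
def all_rows(fetch, q, limit=20):
    """Follow bookmarks until /search returns an empty page."""
    params = {"q": q, "limit": limit}
    while True:
        page = fetch(params)
        if not page["rows"]:
            break
        yield from page["rows"]
        params["bookmark"] = page["bookmark"]

# Stubbed fetch simulating two pages of results, then an empty page
pages = [
    {"rows": [{"_id": "a"}, {"_id": "b"}], "bookmark": "bm1"},
    {"rows": [{"_id": "c"}], "bookmark": "bm2"},
    {"rows": [], "bookmark": "bm2"},
]
fetch = lambda params, it=iter(pages): next(it)
print(len(list(all_rows(fetch, "black"))))  # 3
```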
#### Example Response

All searches will respond in the same way.

```
{
  "total_rows": 19,                    // the total number of rows in the dataset
  "bookmark": "g1AAAA...JjFkA0kLVvg",  // bookmark, for pagination
  "rows": [                            // the rows returned in this response
    { ... },
    { ... },
    { ... }
    // ...one object per returned row, 19 in total
  ],
  "counts": {                          // counts of the fields which were selected as facets during import
    "type": {
      "Black": 19
    }
  },
  "from_cache": true,                  // did this response come from the cache?
  "_ts": 1467108849821
}
```
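A client might read the facet counts from this response as follows; a Python sketch using a trimmed copy of the example above (the row content is a hypothetical placeholder):

```python
import json

response = json.loads("""{
  "total_rows": 19,
  "bookmark": "g1AAAA...JjFkA0kLVvg",
  "rows": [{"_id": "abc123", "type": "Black"}],
  "counts": {"type": {"Black": 19}},
  "from_cache": true,
  "_ts": 1467108849821
}""")

# counts is nested: field name -> facet value -> number of matching rows
for field, values in response["counts"].items():
    for value, count in values.items():
        print(f"{field}={value}: {count} rows")  # type=Black: 19 rows
```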
### Get a specific row

A specific row can be returned using its unique ID, found in the `_id` field of each row. This is done by using the `GET /row/:id` endpoint.

```bash
GET /row/44d2a49201625252a51d252824932580
```

This will return the JSON representation of this specific row.
### Add a new row

New data can be added a row at a time using the `POST /row` endpoint.

Call this endpoint passing in key/value pairs that match the fields in the existing data. There are __NO__ required fields, and all field types will be enforced. The request will fail if any fields are passed in that do not already exist in the dataset.

```bash
POST /row -d 'field_1=value_1&field_n=value_n'
```

The `_id` of the new row will be auto-generated and returned in the `id` field of the response.

```json
{
  "ok": true,
  "id": "22a747412adab2882be7e38a1393f4f2",
  "rev": "1-8a23bfa9ee2c88f2ae8dd071d2cafd56"
}
```
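The key/value body is ordinary `application/x-www-form-urlencoded` data. A Python sketch of building it (the field names here are hypothetical and must match your dataset):

```python
from urllib.parse import urlencode

row = {"colour": "black", "brand": "fender"}  # hypothetical fields
body = urlencode(row)  # the body that `curl -d` would send to POST /row
print(body)  # colour=black&brand=fender
```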
### Update an existing row

Existing data can be updated using the `PUT /row/:id` endpoint.

Call this endpoint passing in key/value pairs that match the fields in the existing data; you must also include the `_id` parameter in the key/value pairs. There are _NO_ required fields, and all field types will be enforced. The request will fail if any fields are passed in that do not already exist in the dataset.

> *Note:* Any fields which are not provided at the time of an update will be removed. Even if a field is not changing, it must always be provided to preserve its value.

The response is similar to that of adding a row, although note that the revision number of the document has increased.

```json
{
  "ok": true,
  "id": "22a747412adab2882be7e38a1393f4f2",
  "rev": "2-6281e0a21ed461659dba6a96d3931ccf"
}
```
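Because omitted fields are removed, a safe update pattern is read, merge, write. A Python sketch of the merge step (the `colour`/`brand` fields are hypothetical; the ID is taken from the example responses above):

```python
def build_update(current, changes):
    # Start from the full current row so unchanged fields survive the PUT
    updated = dict(current)
    updated.update(changes)
    return updated

current = {"_id": "22a747412adab2882be7e38a1393f4f2",
           "colour": "black", "brand": "fender"}
print(build_update(current, {"colour": "sunburst"}))
# colour is updated; brand and _id are preserved in the payload
```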
### Deleting a row

A specific row can be deleted using its unique ID, found in the `_id` field of each row. This is done by using the `DELETE /row/:id` endpoint.

```bash
DELETE /row/44d2a49201625252a51d252824932580
```

The response is similar to that of editing a row, although again note that the revision number of the document has increased once more.

```json
{
  "ok": true,
  "id": "22a747412adab2882be7e38a1393f4f2",
  "rev": "3-37b4f5c715916bf8f90ed997d57dc437"
}
```

## Privacy Notice

The Simple Search Service web application includes code to track deployments to Bluemix and other Cloud Foundry platforms. The following information is sent to a [Deployment Tracker](https://github.com/IBM-Bluemix/cf-deployment-tracker-service) service on each deployment:

@@ -86,7 +256,7 @@ This data is collected from the `VCAP_APPLICATION` environment variable in IBM B

For manual deploys, deployment tracking can be disabled by removing `require("cf-deployment-tracker-client").track();` from the end of the `app.js` main server file.

### License

Copyright 2016 IBM Cloud Data Services
