Commit f276587 — Merge pull request #163 from firebase/next
Jan 16, 2020 release
2 parents 5d6a007 + c7a1bc3

File tree: 26 files changed, +636 −105 lines


auth-mailchimp-sync/README.md (1 addition, 1 deletion)

@@ -35,7 +35,7 @@ Usage of this extension also requires you to have a Mailchimp account. You are r
 
 **Configuration Parameters:**
 
-* Deployment location: Where should the extension be deployed? For help selecting a location, refer to the [location selection guide](https://firebase.google.com/docs/functions/locations).
+* Cloud Functions location: Where do you want to deploy the functions created for this extension?
 
 * Mailchimp API key: What is your Mailchimp API key? To obtain a Mailchimp API key, go to your [Mailchimp account](https://admin.mailchimp.com/account/api/).

auth-mailchimp-sync/extension.yaml (2 additions, 4 deletions)

@@ -67,11 +67,9 @@ resources:
 params:
   - param: LOCATION
     type: select
-    label: Deployment location
+    label: Cloud Functions location
     description: >-
-      Where should the extension be deployed? For help selecting a location,
-      refer to the [location selection
-      guide](https://firebase.google.com/docs/functions/locations).
+      Where do you want to deploy the functions created for this extension?
     options:
       - label: Iowa (us-central1)
         value: us-central1

delete-user-data/CHANGELOG.md (4 additions, 0 deletions)

@@ -1,3 +1,7 @@
+## Version 0.1.3
+
+feature - Support deletion of directories (issue #148).
+
 ## Version 0.1.2
 
 feature - Add a new param for recursively deleting subcollections in Cloud Firestore (issue #14).

delete-user-data/extension.yaml (37 additions, 30 deletions)

@@ -15,11 +15,11 @@
 name: delete-user-data
 displayName: Delete User Data
 specVersion: v1beta
-version: 0.1.2
+version: 0.1.3
 
 description:
-  Deletes data keyed on a userId from Cloud Firestore, Realtime
-  Database, and/or Cloud Storage when a user deletes their account.
+  Deletes data keyed on a userId from Cloud Firestore, Realtime Database, and/or
+  Cloud Storage when a user deletes their account.
 
 license: Apache-2.0
 billingRequired: false

@@ -52,9 +52,10 @@ resources:
   - name: clearData
     type: firebaseextensions.v1beta.function
     description:
-      Listens for user accounts to be deleted from your project's authenticated users,
-      then removes any associated user data (based on Firebase Authentication's User ID) from
-      Realtime Database, Cloud Firestore, and/or Cloud Storage.
+      Listens for user accounts to be deleted from your project's authenticated
+      users, then removes any associated user data (based on Firebase
+      Authentication's User ID) from Realtime Database, Cloud Firestore, and/or
+      Cloud Storage.
     properties:
       sourceDirectory: .
       location: ${LOCATION}

@@ -65,11 +66,12 @@ resources:
 params:
   - param: LOCATION
     type: select
-    label: Deployment location
+    label: Cloud Functions location
     description: >-
-      Where should the extension be deployed? You usually want a location close to your database.
-      For help selecting a location, refer to the
-      [location selection guide](https://firebase.google.com/docs/functions/locations).
+      Where do you want to deploy the functions created for this extension?
+      You usually want a location close to your database or Storage bucket.
+      For help selecting a location, refer to the [location selection
+      guide](https://firebase.google.com/docs/functions/locations).
     options:
       - label: Iowa (us-central1)
         value: us-central1

@@ -95,21 +97,23 @@ params:
     example: users/{UID},admins/{UID}
     required: false
     description: >-
-      Which paths in your Cloud Firestore instance contain user data? Leave empty if
-      you don't use Cloud Firestore.
+      Which paths in your Cloud Firestore instance contain user data? Leave
+      empty if you don't use Cloud Firestore.
 
-      Enter the full paths, separated by commas. You can represent the User ID of the deleted user with `{UID}`.
+      Enter the full paths, separated by commas. You can represent the User ID
+      of the deleted user with `{UID}`.
 
-      For example, if you have the collections `users` and `admins`, and each collection
-      has documents with User ID as document IDs, then you can enter `users/{UID},admins/{UID}`.
+      For example, if you have the collections `users` and `admins`, and each
+      collection has documents with User ID as document IDs, then you can enter
+      `users/{UID},admins/{UID}`.
 
   - param: FIRESTORE_DELETE_MODE
     type: select
     label: Cloud Firestore delete mode
     description: >-
-      (Only applicable if you use the `Cloud Firestore paths` parameter.) How do you want
-      to delete Cloud Firestore documents? To also delete documents in subcollections,
-      set this parameter to `recursive`.
+      (Only applicable if you use the `Cloud Firestore paths` parameter.) How do
+      you want to delete Cloud Firestore documents? To also delete documents in
+      subcollections, set this parameter to `recursive`.
     options:
       - label: Recursive
         value: recursive

@@ -124,10 +128,11 @@ params:
     example: users/{UID},admins/{UID}
     required: false
     description: >-
-      Which paths in your Realtime Database instance contain user data? Leave empty if you
-      don't use Realtime Database.
+      Which paths in your Realtime Database instance contain user data? Leave
+      empty if you don't use Realtime Database.
 
-      Enter the full paths, separated by commas. You can represent the User ID of the deleted user with `{UID}`.
+      Enter the full paths, separated by commas. You can represent the User ID
+      of the deleted user with `{UID}`.
 
       For example: `users/{UID},admins/{UID}`.

@@ -140,12 +145,14 @@ params:
       Where in Google Cloud Storage do you store user data? Leave empty if you
       don't use Cloud Storage.
 
-      Enter the full paths, separated by commas. You can represent the User ID of the deleted user with `{UID}`.
-      You can use `{DEFAULT}` to represent your default bucket.
-
-      For example, if you are using your default bucket,
-      and the bucket has files with the naming scheme `{UID}-pic.png`,
-      then you can enter `{DEFAULT}/{UID}-pic.png`.
-      If you also have files in another bucket called `my-awesome-app-logs`,
-      and that bucket has files with the naming scheme `{UID}-logs.txt`,
-      then you can enter `{DEFAULT}/{UID}-pic.png,my-awesome-app-logs/{UID}-logs.txt`.
+      Enter the full paths to files or directories in your Storage buckets,
+      separated by commas. Use `{UID}` to represent the User ID of the deleted
+      user, and use `{DEFAULT}` to represent your default Storage bucket.
+
+      Here's a series of examples. To delete all the files in your default
+      bucket with the file naming scheme `{UID}-pic.png`, enter
+      `{DEFAULT}/{UID}-pic.png`. To also delete all the files in another bucket
+      called my-app-logs with the file naming scheme `{UID}-logs.txt`, enter
+      `{DEFAULT}/{UID}-pic.png,my-app-logs/{UID}-logs.txt`. To *also* delete a User
+      ID-labeled directory and all its files (like `media/{UID}`), enter
+      `{DEFAULT}/{UID}-pic.png,my-app-logs/{UID}-logs.txt,{DEFAULT}/media/{UID}`.
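The new parameter description treats each configured Storage path as a bucket name followed by an object prefix. Here is a minimal sketch of how such a path might expand for a deleted user; the helper name `expandStoragePath` and the default bucket value are hypothetical, not part of the extension:

```typescript
// Hypothetical helper: expands one configured Storage path for a deleted
// user, mirroring the documented {UID} and {DEFAULT} placeholders.
const DEFAULT_BUCKET = "my-project.appspot.com"; // assumed default bucket name

function expandStoragePath(
  configuredPath: string,
  uid: string
): { bucket: string; prefix: string } {
  // Substitute the deleted user's ID into the path.
  const resolved = configuredPath.replace(/{UID}/g, uid);
  const parts = resolved.split("/");
  const bucket = parts[0] === "{DEFAULT}" ? DEFAULT_BUCKET : parts[0];
  // Everything after the bucket segment is treated as an object prefix, so a
  // directory-style path like media/{UID} covers all files under it.
  return { bucket, prefix: parts.slice(1).join("/") };
}

console.log(expandStoragePath("{DEFAULT}/{UID}-pic.png", "abc123"));
// → bucket "my-project.appspot.com", prefix "abc123-pic.png"
console.log(expandStoragePath("{DEFAULT}/media/{UID}", "abc123"));
// → bucket "my-project.appspot.com", prefix "media/abc123"
```

Because the prefix is a plain string match, `{DEFAULT}/media/{UID}` sweeps every object stored under that user's directory.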

delete-user-data/functions/lib/index.js (8 additions, 7 deletions)

@@ -91,19 +91,20 @@ const clearStorageData = (storagePaths, uid) => __awaiter(void 0, void 0, void 0
     const bucket = bucketName === "{DEFAULT}"
         ? admin.storage().bucket()
         : admin.storage().bucket(bucketName);
-    const file = bucket.file(parts.slice(1).join("/"));
-    const bucketFilePath = `${bucket.name}/${file.name}`;
+    const prefix = parts.slice(1).join("/");
     try {
-        logs.storagePathDeleting(bucketFilePath);
-        yield file.delete();
-        logs.storagePathDeleted(bucketFilePath);
+        logs.storagePathDeleting(prefix);
+        yield bucket.deleteFiles({
+            prefix,
+        });
+        logs.storagePathDeleted(prefix);
     }
     catch (err) {
         if (err.code === 404) {
-            logs.storagePath404(bucketFilePath);
+            logs.storagePath404(prefix);
         }
         else {
-            logs.storagePathError(bucketFilePath, err);
+            logs.storagePathError(prefix, err);
         }
     }
 }));

delete-user-data/functions/src/index.ts (8 additions, 7 deletions)

@@ -92,17 +92,18 @@ const clearStorageData = async (storagePaths: string, uid: string) => {
       bucketName === "{DEFAULT}"
         ? admin.storage().bucket()
         : admin.storage().bucket(bucketName);
-    const file = bucket.file(parts.slice(1).join("/"));
-    const bucketFilePath = `${bucket.name}/${file.name}`;
+    const prefix = parts.slice(1).join("/");
     try {
-      logs.storagePathDeleting(bucketFilePath);
-      await file.delete();
-      logs.storagePathDeleted(bucketFilePath);
+      logs.storagePathDeleting(prefix);
+      await bucket.deleteFiles({
+        prefix,
+      });
+      logs.storagePathDeleted(prefix);
     } catch (err) {
       if (err.code === 404) {
-        logs.storagePath404(bucketFilePath);
+        logs.storagePath404(prefix);
       } else {
-        logs.storagePathError(bucketFilePath, err);
+        logs.storagePathError(prefix, err);
       }
     }
   });
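The behavioral change in this diff is that the extension now deletes by object-name prefix instead of targeting a single file: `deleteFiles({ prefix })` in the Cloud Storage client removes every object whose name starts with the prefix. The sketch below models that matching rule on a plain array of object names (the helper and sample names are illustrative, not the extension's code):

```typescript
// Illustrative model of prefix deletion: deleteFiles({ prefix }) removes all
// objects whose names start with the prefix, not just one exact name.
function matchesPrefix(objectNames: string[], prefix: string): string[] {
  return objectNames.filter((name) => name.startsWith(prefix));
}

const objects = [
  "abc123-pic.png",
  "media/abc123/a.jpg",
  "media/abc123/b.jpg",
  "media/other/c.jpg",
];

// Old behavior: file.delete() targeted exactly one object name.
// New behavior: a directory-style prefix sweeps everything under it.
console.log(matchesPrefix(objects, "media/abc123"));
// → ["media/abc123/a.jpg", "media/abc123/b.jpg"]
```

This is what makes the new "Support deletion of directories" feature work: a path like `media/{UID}` becomes a prefix that covers the whole directory.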

firestore-bigquery-export/POSTINSTALL.md (16 additions, 9 deletions)

@@ -13,13 +13,15 @@ You can test out this extension right away:
 1. Query your **raw changelog table**, which should contain a single log of creating the `bigquery-mirror-test` document.
 
    ```
-   SELECT * FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_changelog`
+   SELECT *
+   FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_changelog`
    ```
 
 1. Query your **latest view**, which should return the latest change event for the only document present -- `bigquery-mirror-test`.
 
    ```
-   SELECT * FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_latest`
+   SELECT *
+   FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_latest`
    ```
 
 1. Delete the `bigquery-mirror-test` document from [Cloud Firestore](https://console.firebase.google.com/project/${param:PROJECT_ID}/database/firestore/data).

@@ -28,9 +30,10 @@ The `bigquery-mirror-test` document will disappear from the **latest view** and
 1. You can check the changelogs of a single document with this query:
 
    ```
-   SELECT * FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_changelog`
-   WHERE document_name = "bigquery-mirror-test"
-   ORDER BY TIMESTAMP ASC
+   SELECT *
+   FROM `${param:PROJECT_ID}.${param:DATASET_ID}.${param:TABLE_ID}_raw_changelog`
+   WHERE document_name = "bigquery-mirror-test"
+   ORDER BY TIMESTAMP ASC
    ```
 
 ### Using the extension

@@ -48,13 +51,17 @@ Note that this extension only listens for _document_ changes in the collection,
 
 This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the import script provided by this extension.
 
-The import script can read all existing documents in a Cloud Firestore collection and insert them into the raw changelog table created by this extension. The script adds a special changelog for each document with the operation of `IMPORT` and the timestamp of epoch. This is to ensure that any operation on an imported document supersedes the `IMPORT`
+The import script can read all existing documents in a Cloud Firestore collection and insert them into the raw changelog table created by this extension. The script adds a special changelog for each document with the operation of `IMPORT` and the timestamp of epoch. This is to ensure that any operation on an imported document supersedes the `IMPORT`.
 
-**Important:** Run the script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
+**Important:** Run the import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
 
-You may pause and resume the script from the last batch at any point.
+Learn more about using the import script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).
 
-Learn more about using this script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).
+### _(Optional)_ Generate schema views
+
+After your data is in BigQuery, you can use the schema-views script (provided by this extension) to create views that make it easier to query relevant data. You only need to provide a JSON schema file that describes your data structure, and the schema-views script will create the views.
+
+Learn more about using the schema-views script to [generate schema views](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/GENERATE_SCHEMA_VIEWS.md).
 
 ### Monitoring
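The POSTINSTALL text explains that import rows are stamped with the epoch timestamp so that any real write supersedes the `IMPORT` entry. The reasoning can be sketched as a per-document "keep the event with the greatest timestamp" rule; the event shape and helper below are illustrative, not the extension's actual schema:

```typescript
// Sketch of why an IMPORT row at the epoch is always superseded: the latest
// view keeps, per document, the change event with the greatest timestamp,
// and any real write has a timestamp after 1970-01-01T00:00:00Z.
interface ChangeEvent {
  operation: string;
  timestamp: number; // milliseconds since the Unix epoch
}

function latest(events: ChangeEvent[]): ChangeEvent {
  // Later events win; ties go to the later-seen event.
  return events.reduce((a, b) => (b.timestamp >= a.timestamp ? b : a));
}

const events: ChangeEvent[] = [
  { operation: "IMPORT", timestamp: 0 }, // backfilled row at the epoch
  { operation: "UPDATE", timestamp: Date.parse("2020-01-16T00:00:00Z") },
];

console.log(latest(events).operation); // → "UPDATE"
```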

firestore-bigquery-export/PREINSTALL.md (7 additions, 3 deletions)

@@ -16,11 +16,15 @@ Before installing this extension, you'll need to:
 + [Set up Cloud Firestore in your Firebase project.](https://firebase.google.com/docs/firestore/quickstart)
 + [Link your Firebase project to BigQuery.](https://support.google.com/firebase/answer/6318765)
 
-This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the import script provided by this extension.
+#### Backfill your BigQuery dataset
 
-**Important:** Run the script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
+This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the [import script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md) provided by this extension.
 
-Learn more about using this script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).
+**Important:** Run the import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
+
+#### Generate schema views
+
+After your data is in BigQuery, you can run the [schema-views script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/GENERATE_SCHEMA_VIEWS.md) (provided by this extension) to create views that make it easier to query relevant data. You only need to provide a JSON schema file that describes your data structure, and the schema-views script will create the views.
 
 #### Billing

firestore-bigquery-export/README.md (8 additions, 4 deletions)

@@ -22,11 +22,15 @@ Before installing this extension, you'll need to:
 + [Set up Cloud Firestore in your Firebase project.](https://firebase.google.com/docs/firestore/quickstart)
 + [Link your Firebase project to BigQuery.](https://support.google.com/firebase/answer/6318765)
 
-This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the import script provided by this extension.
+#### Backfill your BigQuery dataset
 
-**Important:** Run the script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
+This extension only sends the content of documents that have been changed -- it does not export your full dataset of existing documents into BigQuery. So, to backfill your BigQuery dataset with all the documents in your collection, you can run the [import script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md) provided by this extension.
 
-Learn more about using this script to [backfill your existing collection](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/IMPORT_EXISTING_DOCUMENTS.md).
+**Important:** Run the import script over the entire collection _after_ installing this extension, otherwise all writes to your database during the import might be lost.
+
+#### Generate schema views
+
+After your data is in BigQuery, you can run the [schema-views script](https://github.com/firebase/extensions/blob/master/firestore-bigquery-export/guides/GENERATE_SCHEMA_VIEWS.md) (provided by this extension) to create views that make it easier to query relevant data. You only need to provide a JSON schema file that describes your data structure, and the schema-views script will create the views.
 
 #### Billing

@@ -43,7 +47,7 @@ When you use Firebase Extensions, you're only charged for the underlying resourc
 
 **Configuration Parameters:**
 
-* Deployment location: Where should the extension be deployed? You usually want a location close to your database. For help selecting a location, refer to the [location selection guide](https://firebase.google.com/docs/functions/locations).
+* Cloud Functions location: Where do you want to deploy the functions created for this extension? You usually want a location close to your database. For help selecting a location, refer to the [location selection guide](https://firebase.google.com/docs/functions/locations). Note that this extension locates your BigQuery dataset in `us-central1`.
 
 * Collection path: What is the path of the collection that you would like to export? You may use `{wildcard}` notation to match a subcollection of all documents in a collection (for example: `chatrooms/{chatid}/posts`).

firestore-bigquery-export/extension.yaml (6 additions, 4 deletions)

@@ -58,11 +58,13 @@ resources:
 params:
   - param: LOCATION
     type: select
-    label: Deployment location
+    label: Cloud Functions location
     description: >-
-      Where should the extension be deployed? You usually want a location close to your database.
-      For help selecting a location, refer to the
-      [location selection guide](https://firebase.google.com/docs/functions/locations).
+      Where do you want to deploy the functions created for this extension?
+      You usually want a location close to your database. For help selecting a
+      location, refer to the [location selection
+      guide](https://firebase.google.com/docs/functions/locations).
+      Note that this extension locates your BigQuery dataset in `us-central1`.
     options:
       - label: Iowa (us-central1)
         value: us-central1
