This is a mixed target array configuration. If you want to crawl multiple pages, and some pages may fail and need to be retried, you can write it this way:
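A minimal sketch of such a mixed target array, assuming the `xCrawl` factory and `crawlPage` API used throughout these docs; the URLs, the `maxRetry` value, and the `isSuccess` result field are illustrative assumptions:

```javascript
import xCrawl from 'x-crawl'

const myXCrawl = xCrawl()

// Plain URL strings and detail target objects can be mixed freely.
// Only the target that tends to fail gets its own maxRetry.
myXCrawl
  .crawlPage([
    'https://example.com/stable-page',
    { url: 'https://example.com/flaky-page', maxRetry: 2 }
  ])
  .then((res) => {
    // res is an array of result objects, one per target
    res.forEach((item) => console.log(item.isSuccess))
  })
```

Targets given as plain strings simply use the defaults; only the detail objects carry per-target options.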
The res you get will be an array of objects.
More configuration options can be found in [CrawlPageDetailTargetConfig](#CrawlPageDetailTargetConfig).
##### Advanced config - CrawlPageAdvancedConfig
This is an advanced configuration; targets is a mixed target array configuration. If you want to crawl multiple pages without repeating the request configuration (proxy, cookies, retry, etc.) for each target, and you need an interval time, you can write it this way:
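The advanced form described above might look like the sketch below. The URLs, cookie value, and interval bounds are placeholders, and the result fields (`id`, `isSuccess`) are assumptions about the result shape:

```javascript
import xCrawl from 'x-crawl'

const myXCrawl = xCrawl()

// Shared request options (retry, cookies, interval) are written once
// on the advanced config instead of being repeated on every target.
myXCrawl
  .crawlPage({
    targets: [
      'https://example.com/page-1',
      { url: 'https://example.com/page-2', maxRetry: 5 } // per-target override
    ],
    maxRetry: 2,                           // default retry for all targets
    cookies: 'sessionId=abc123',           // sent with every request
    intervalTime: { max: 3000, min: 1000 } // random wait between targets
  })
  .then((res) => {
    res.forEach((item) => console.log(item.id, item.isSuccess))
  })
```

Options set on an individual detail target take precedence over the shared ones on the advanced config.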
This is a mixed target array configuration. If you want to crawl multiple pieces of data, and some of them may fail and need to be retried, you can write it this way:
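As with pages, a data target array can mix plain strings and detail objects. A hedged sketch, with placeholder endpoints and the `data` result field assumed:

```javascript
import xCrawl from 'x-crawl'

const myXCrawl = xCrawl()

// The unstable endpoint is given its own retry count.
myXCrawl
  .crawlData([
    'https://example.com/api/list',
    { url: 'https://example.com/api/unstable', maxRetry: 3 }
  ])
  .then((res) => {
    // Each result object carries the fetched response in item.data
    res.forEach((item) => console.log(item.data))
  })
```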
The res you get will be an array of objects.
More configuration options can be found in [CrawlDataDetailTargetConfig](#CrawlDataDetailTargetConfig).
##### Advanced config - CrawlDataAdvancedConfig
This is an advanced configuration; targets is a mixed target array configuration. If you want to crawl multiple pieces of data without repeating the request configuration (proxy, cookies, retry, etc.) for each target, and you need an interval time, you can write it this way:
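A sketch of the advanced data config, with placeholder endpoints and an assumed `isSuccess` result field:

```javascript
import xCrawl from 'x-crawl'

const myXCrawl = xCrawl()

// maxRetry and intervalTime apply to every target in the array,
// so they only need to be written once.
myXCrawl
  .crawlData({
    targets: [
      'https://example.com/api/a',
      'https://example.com/api/b'
    ],
    maxRetry: 2,                          // shared by all targets
    intervalTime: { max: 2000, min: 500 } // wait between requests
  })
  .then((res) => {
    res.forEach((item) => console.log(item.isSuccess))
  })
```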
This is the detailed target array configuration. If you want to crawl multiple files, and some of them may need to be retried after failure, you can write it this way:
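A minimal sketch of a file target array in that style; the file URLs and `maxRetry` value are placeholders, and `isSuccess` is an assumption about the result shape:

```javascript
import xCrawl from 'x-crawl'

const myXCrawl = xCrawl()

// The second file is expected to be flaky, so it gets its own retry count.
myXCrawl
  .crawlFile([
    'https://example.com/files/a.jpg',
    { url: 'https://example.com/files/b.jpg', maxRetry: 2 }
  ])
  .then((res) => {
    res.forEach((item) => console.log(item.isSuccess))
  })
```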
The res you get will be an array of objects.
More configuration options can be found in [CrawlFileDetailTargetConfig](#CrawlFileDetailTargetConfig).
##### Advanced config - CrawlFileAdvancedConfig
This is an advanced configuration; targets is a mixed target array configuration. If you want to crawl multiple files without repeating the request configuration (storeDir, proxy, retry, etc.) for each target, and you need an interval time, you can write it this way:
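A sketch of the advanced file config. The URLs, directory, and interval bounds are placeholders; the exact name of the shared store-directory property (written here as `storeDir`, following the paragraph above) may differ between library versions:

```javascript
import xCrawl from 'x-crawl'

const myXCrawl = xCrawl()

// The store directory, retry count, and interval are written once and
// shared by every file target.
myXCrawl
  .crawlFile({
    targets: [
      'https://example.com/files/a.jpg',
      'https://example.com/files/b.jpg'
    ],
    storeDir: './download',                // shared store directory (name assumed)
    maxRetry: 2,
    intervalTime: { max: 3000, min: 1000 } // wait between downloads
  })
  .then((res) => {
    res.forEach((item) => console.log(item.isSuccess))
  })
```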