docs/rdd-programming-guide.md: 2 changes (1 addition, 1 deletion)
@@ -207,7 +207,7 @@ In the PySpark shell, a special interpreter-aware SparkContext is already create
 variable called `sc`. Making your own SparkContext will not work. You can set which master the
 context connects to using the `--master` argument, and you can add Python .zip, .egg or .py files
 to the runtime path by passing a comma-separated list to `--py-files`. For third-party Python dependencies,
-see [Python Package Management](api/python/user_guide/python_packaging.html). You can also add dependencies
+see [Python Package Management](api/python/tutorial/python_packaging.html). You can also add dependencies
 (e.g. Spark Packages) to your shell session by supplying a comma-separated list of Maven coordinates
 to the `--packages` argument. Any additional repositories where dependencies might exist (e.g. Sonatype)
 can be passed to the `--repositories` argument. For example, to run
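For context, the hunk above edits prose describing the `bin/pyspark` shell options. A minimal sketch of an invocation that combines those flags follows; the master URL, file names, Maven coordinate, and repository URL are placeholders, not values taken from the patch:

```bash
# Hypothetical invocation of the flags the documentation describes:
#   --master        which cluster the shell's SparkContext connects to
#   --py-files      comma-separated .zip/.egg/.py files added to the runtime path
#   --packages      comma-separated Maven coordinates of extra dependencies
#   --repositories  additional repositories to resolve those coordinates from
$ ./bin/pyspark --master "local[4]" \
    --py-files code.py,deps.zip \
    --packages "com.example:example-pkg_2.12:1.0.0" \
    --repositories https://repo.example.com/releases
```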