@@ -17,9 +17,7 @@ from the Python programming language.
 * [Install `setuptools`](#install-setuptools)
 * [Getting the Dataflow software](#getting-the-dataflow-software)
 * [Create and activate virtual environment](#create-and-activate-virtual-environment)
-* [Download](#download)
-* [Install](#install)
-* [Test](#test)
+* [Download and install](#download-and-install)
 * [Local execution of a pipeline](#local-execution-of-a-pipeline)
 * [A Quick Tour of the Source Code](#a-quick-tour-of-the-source-code)
 * [Some Simple Examples](#some-simple-examples)
@@ -159,34 +157,28 @@ environment's directories. To activate a virtual environment in Bash:
 That is, source the script `bin/activate` under the virtual environment
 directory you created.
 
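The create-and-activate flow described above can be sketched end to end. The environment directory name `env` is an assumption, and the stdlib `venv` module is used here as a stand-in for the `virtualenv` tool of the guide's Python 2 era:

```sh
# Sketch of create-and-activate (directory name `env` is assumed;
# `virtualenv env` is the period-appropriate equivalent of the first line).
python3 -m venv env
. env/bin/activate                         # i.e. source bin/activate
python -c 'import sys; print(sys.prefix)'  # prints a path inside ./env
```

After activation, `python` and `pip` resolve to the environment's own copies; running `deactivate` restores the previous shell state.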
-#### Download
+#### Download and install
 
-Clone the SDK from GitHub:
+Install the latest tarball from GitHub by browsing to
+<https://github.com/GoogleCloudPlatform/DataflowPythonSDK/releases/latest>
+and copying one of the "Source code" links. The `.tar.gz` file is smaller;
+we'll assume you use that one. With a virtual environment active, paste the
+URL into a `pip install` shell command, executing something like this:
 
-    git clone https://github.com/GoogleCloudPlatform/DataflowPythonSDK
-
-#### Install
-
-With a virtual environment active, install the Dataflow package:
-
-    cd DataflowPythonSDK
-    python setup.py install
-
-#### Test
-
-After install, run the tests to make sure everything is okay.
-
-    python setup.py test
+```sh
+pip install https://github.com/GoogleCloudPlatform/DataflowPythonSDK/archive/va.b.c.tar.gz
+```
 
 ## Local execution of a pipeline
 
-The `google/cloud/dataflow/examples` subdirectory in the
-source distribution has many examples large and small.
+The `$VIRTUAL_ENV/lib/python2.7/site-packages/google/cloud/dataflow/examples`
+subdirectory (the `google/cloud/dataflow/examples` subdirectory in the
+source distribution) has many examples large and small.
 
 All examples can be run locally by passing the arguments required by the
 example script. For instance, to run `wordcount.py`, try:
 
-    python google/cloud/dataflow/examples/wordcount.py --output OUTPUT_FILE
+    python -m google.cloud.dataflow.examples.wordcount --output OUTPUT_FILE
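The `-m` flag in the new command runs a module by its dotted import path rather than by file path, so it works from any directory once the package is installed. A generic illustration with a stdlib module (`json.tool` is only a stand-in for the wordcount example):

```sh
# `python -m <dotted.path>` locates the module on sys.path and runs it
# as a script; json.tool pretty-prints JSON read from stdin.
echo '{"output": "OUTPUT_FILE"}' | python -m json.tool
```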
 
 ## A Quick Tour of the Source Code
 