
Conversation

@r0mainK (Contributor) commented May 8, 2018

Since I got no responses, I opened this PR to solve the issue.
TL;DR: The pyspark version we depend on reports itself as 2.2.0 once installed, but is published as 2.2.0.post0 on PyPI. Because the pinned version never matches the installed one, pip reinstalls pyspark on every install/upgrade of the engine, even when the correct Spark distribution is already present. This PR removes that behavior.
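The mismatch can be sketched with the `packaging` library (the same PEP 440 logic that pip vendors). The exact pin used by the engine is an assumption here; the point is that a `==2.2.0.post0` specifier does not accept an installed distribution reporting `2.2.0`, and vice versa, so pip always sees the requirement as unsatisfied:

```python
# Sketch of the version mismatch, using the PEP 440 rules from the
# `packaging` library (the same logic pip uses internally).
from packaging.specifiers import SpecifierSet
from packaging.version import Version

installed = Version("2.2.0")        # what the installed pyspark reports
published = Version("2.2.0.post0")  # what PyPI serves

# A post-release sorts strictly after the base release.
print(published > installed)  # True

# A pin on the PyPI version is never satisfied by the installed one...
print(SpecifierSet("==2.2.0.post0").contains("2.2.0"))  # False

# ...and a plain ==2.2.0 pin does not match the post-release either,
# since exact matches exclude post-releases unless stated explicitly.
print(SpecifierSet("==2.2.0").contains("2.2.0.post0"))  # False
```

Either way the pin is written, pip concludes the requirement is unmet and reinstalls pyspark on every run, which is the behavior this PR removes.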

Signed-off-by: Romain Keramitas <r.keramitas@gmail.com>
@ajnavarro ajnavarro merged commit 6782e03 into src-d:master May 9, 2018