bahir.git
6 years ago[BAHIR-48] Documentation for project README. 20/head
Prashant Sharma [Wed, 10 Aug 2016 06:53:49 +0000 (12:23 +0530)] 
[BAHIR-48] Documentation for project README.

6 years ago[BAHIR-42] Refactor sql-streaming-mqtt scala example
Luciano Resende [Sat, 6 Aug 2016 17:24:01 +0000 (20:24 +0300)] 
[BAHIR-42] Refactor sql-streaming-mqtt scala example

6 years ago[maven-release-plugin] prepare for next development iteration
Luciano Resende [Sat, 6 Aug 2016 09:56:17 +0000 (12:56 +0300)] 
[maven-release-plugin] prepare for next development iteration

6 years ago[maven-release-plugin] prepare release v2.0.0-rc1 v2.0.0 v2.0.0-rc1
Luciano Resende [Sat, 6 Aug 2016 09:55:58 +0000 (12:55 +0300)] 
[maven-release-plugin] prepare release v2.0.0-rc1

6 years ago[BAHIR-44] Add new sql-streaming-mqtt to distribution profile
Luciano Resende [Sat, 6 Aug 2016 09:51:24 +0000 (12:51 +0300)] 
[BAHIR-44] Add new sql-streaming-mqtt to distribution profile

6 years ago[BAHIR-42] Refactor sql-streaming-mqtt example
Luciano Resende [Sat, 6 Aug 2016 09:13:05 +0000 (12:13 +0300)] 
[BAHIR-42] Refactor sql-streaming-mqtt example

Move JavaMQTTStreamWordCount to the examples root folder,
which is processed by the build as test resources and not
built into the extension itself, following the pattern used
by other examples.

6 years ago[BAHIR-43] Add Apache License header file
Luciano Resende [Sat, 6 Aug 2016 09:11:34 +0000 (12:11 +0300)] 
[BAHIR-43] Add Apache License header file

6 years ago[BAHIR-39] Add SQL Streaming MQTT support
Prashant Sharma [Tue, 26 Jul 2016 08:17:15 +0000 (13:47 +0530)] 
[BAHIR-39] Add SQL Streaming MQTT support

This adds support for using MQTT sources with the new
Spark Structured Streaming. It uses the MQTT client
persistence layer to provide minimal fault tolerance.

Closes #13
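
As a usage illustration only (not part of the original commit message), reading from an MQTT broker as a Structured Streaming source might look like the sketch below; the provider class name, option names and broker URL are assumptions taken from later Bahir documentation rather than from this commit.

```
// Hedged sketch: an MQTT topic read as a Structured Streaming source.
import org.apache.spark.sql.SparkSession

object MQTTSourceSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("MQTTSourceSketch").getOrCreate()

    // Each incoming MQTT message should surface as a row in this streaming DataFrame.
    val lines = spark.readStream
      .format("org.apache.bahir.sql.streaming.mqtt.MQTTStreamSourceProvider") // assumed class name
      .option("topic", "sensors/temperature")                                 // assumed topic
      .load("tcp://localhost:1883")                                           // broker URL

    // Print each micro-batch to the console.
    val query = lines.writeStream
      .outputMode("append")
      .format("console")
      .start()

    query.awaitTermination()
  }
}
```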

6 years ago[BAHIR-31] Add basic documentation for ZeroMQ connector
Luciano Resende [Mon, 1 Aug 2016 16:21:24 +0000 (19:21 +0300)] 
[BAHIR-31] Add basic documentation for ZeroMQ connector

6 years ago[BAHIR-30] Add basic documentation for Twitter connector
Luciano Resende [Mon, 1 Aug 2016 16:20:20 +0000 (19:20 +0300)] 
[BAHIR-30] Add basic documentation for Twitter connector

6 years ago[BAHIR-29] Add basic documentation for MQTT Connector
Luciano Resende [Mon, 1 Aug 2016 16:18:35 +0000 (19:18 +0300)] 
[BAHIR-29] Add basic documentation for MQTT Connector

6 years ago[BAHIR-28] Add basic documentation for Akka connector
Luciano Resende [Mon, 1 Aug 2016 16:17:02 +0000 (19:17 +0300)] 
[BAHIR-28] Add basic documentation for Akka connector

6 years ago[BAHIR-38] clean Ivy cache during Maven install phase
Christian Kadner [Wed, 27 Jul 2016 03:37:22 +0000 (20:37 -0700)] 
[BAHIR-38] clean Ivy cache during Maven install phase

When we install the org.apache.bahir jars into the local
Maven repository we also need to clean the previous jar
files from the Ivy cache (~/.ivy2/cache/org.apache.bahir/*)
so spark-submit --packages ... will pick up the new version
from the local Maven repository.

Closes #14

6 years ago[BAHIR-37] Update Spark to release 2.0.0
Luciano Resende [Tue, 26 Jul 2016 16:57:32 +0000 (17:57 +0100)] 
[BAHIR-37] Update Spark to release 2.0.0

6 years ago[BAHIR-35] Add Python sources to binary jar
Christian Kadner [Fri, 22 Jul 2016 06:43:49 +0000 (23:43 -0700)] 
[BAHIR-35] Add Python sources to binary jar

Add python sources to jar to enable `spark-submit --packages …`

This can be verified by the following steps:

 mvn clean install

 rm -rf ~/.ivy2/cache/org.apache.bahir/

 mosquitto -p 1883

 bin/run-example \
    org.apache.spark.examples.streaming.mqtt.MQTTPublisher \
    tcp://localhost:1883 \
    foo

 ${SPARK_HOME}/bin/spark-submit \
    --packages org.apache.bahir:spark-streaming-mqtt_2.11:2.0.0-SNAPSHOT \
    streaming-mqtt/examples/src/main/python/streaming/mqtt_wordcount.py \
    tcp://localhost:1883 \
    foo

Closes #11

6 years ago[BAHIR-24] fix MQTT Python code, examples, add tests
Christian Kadner [Sat, 16 Jul 2016 00:49:40 +0000 (17:49 -0700)] 
[BAHIR-24] fix MQTT Python code, examples, add tests

Changes in this PR:

- remove unnecessary files from streaming-mqtt/python
- updated all *.py files with respect to the modified
  project structure pyspark.streaming.mqtt --> mqtt
- add test cases that were left out from the import and
  add shell script to run them:
    - streaming-mqtt/python-tests/run-python-tests.sh
    - streaming-mqtt/python-tests/tests.py
- modify MQTTTestUtils.scala to limit the required disk storage space
- modify bin/run-example script to setup PYTHONPATH to run Python examples

Closes #10

6 years ago[BAHIR-36] Update Readme.md
Luciano Resende [Fri, 22 Jul 2016 20:46:10 +0000 (13:46 -0700)] 
[BAHIR-36] Update Readme.md

- Add instructions on how to build and test the project

Closes #12

6 years ago[BAHIR-22] add shell script to run examples
Christian Kadner [Tue, 28 Jun 2016 22:07:20 +0000 (15:07 -0700)] 
[BAHIR-22] add shell script to run examples

Apache Spark has a convenience script ./bin/run-example to allow users to
quickly run the pre-packaged examples without having to compose a long(ish)
spark-submit command.
The JavaDoc of most examples refers to that ./bin/run-example script
when describing how to run the example.

This adds a similar convenience script to the Apache Bahir project in order
to stay consistent with the existing (Apache Spark) documentation and to
(at least initially) hide the additional complexity of the spark-submit command.

Example:

./bin/run-example \
  org.apache.spark.examples.streaming.akka.ActorWordCount localhost 9999

...translates to this spark-submit command:

${SPARK_HOME}/bin/spark-submit \
  --packages org.apache.bahir:spark-streaming-akka_2.11:2.0.0-SNAPSHOT \
  --class org.apache.spark.examples.streaming.akka.ActorWordCount \
    streaming-akka/target/spark-streaming-akka_2.11-2.0.0-SNAPSHOT-tests.jar \
  localhost 9999

Closes #4

6 years ago[BAHIR-20] Create release helper scripts
Luciano Resende [Sun, 26 Jun 2016 18:24:21 +0000 (11:24 -0700)] 
[BAHIR-20] Create release helper scripts

Release script to automate:
- Preparing release artifacts
- Publishing Maven artifacts for Scala 2.10 and 2.11
- Publishing snapshots

6 years ago[MINOR] Update scalastyle-config.xml messages for import style checks
Christian Kadner [Thu, 7 Jul 2016 06:55:12 +0000 (23:55 -0700)] 
[MINOR] Update scalastyle-config.xml messages for import style checks

Adds a few custom messages to the Scalastyle configuration.
Without them, there is no way to determine what the
red squiggly line in IntelliJ means.

Closes #7

6 years ago[BAHIR-23] Build should fail on Checkstyle violations
Christian Kadner [Sat, 9 Jul 2016 06:26:51 +0000 (23:26 -0700)] 
[BAHIR-23] Build should fail on Checkstyle violations

Currently the maven build is configured to:

- fail for code style violations in Scala files
- succeed despite code style violations in Java files
- exclude Scala test sources (and examples) from code style checks
- include Java test sources (and examples) in code style checks

This changes the maven build configuration to:

- fail for code style violations in both Scala and Java sources
- include test sources (and examples) in style checks for both
  Scala and Java sources

Additionally, this cleans up unsupported checkstyle configuration
elements (apparently copied over from the scalastyle configuration).

6 years ago[BAHIR-17] Update Apache Spark version back to 2.0.0-SNAPSHOT
Luciano Resende [Sat, 2 Jul 2016 02:23:35 +0000 (19:23 -0700)] 
[BAHIR-17] Update Apache Spark version back to 2.0.0-SNAPSHOT

6 years ago[MINOR] Package names don't correspond to directory structure
Christian Kadner [Wed, 29 Jun 2016 22:47:06 +0000 (15:47 -0700)] 
[MINOR] Package names don't correspond to directory structure

Package names in the package.scala files do not correspond to
the directory structure for the MQTT, Twitter and ZeroMQ modules.
This may cause (future) problems resolving classes
from these package.scala files.

Closes #5

6 years ago[maven-release-plugin] prepare for next development iteration
Luciano Resende [Sun, 26 Jun 2016 18:30:43 +0000 (11:30 -0700)] 
[maven-release-plugin] prepare for next development iteration

6 years ago[maven-release-plugin] prepare release 2.0.0-preview-rc1 2.0.0-preview-rc1 v2.0.0-preview
Luciano Resende [Sun, 26 Jun 2016 18:30:29 +0000 (11:30 -0700)] 
[maven-release-plugin] prepare release 2.0.0-preview-rc1

6 years ago[BAHIR-21] Change scala version script
Luciano Resende [Sun, 26 Jun 2016 18:21:08 +0000 (11:21 -0700)] 
[BAHIR-21] Change scala version script

Script to change the Scala version in use between 2.10 and 2.11,
to help during development builds and also when publishing
releases for both Scala versions.

6 years ago[BAHIR-18] Configure examples as maven test sources
Christian Kadner [Thu, 23 Jun 2016 01:40:05 +0000 (18:40 -0700)] 
[BAHIR-18] Configure examples as maven test sources

This PR configures the examples as Maven test resources so they are
recognized by Maven builds. This accomplishes the following:

- The examples get compiled
- IDEs like IntelliJ or Eclipse recognize the
  <module>/examples/src/[java|scala] as source folders
- Keep the examples, along with their additional dependencies,
  excluded from the generated binaries

Closes #2

6 years ago[MINOR] Correcting JavaDoc & adding EOF character
Christian Kadner [Sat, 25 Jun 2016 04:08:41 +0000 (21:08 -0700)] 
[MINOR] Correcting JavaDoc & adding EOF character

minor corrections to commit 9110e56

Closes #3

6 years ago[BAHIR-19] Update source distribution assembly name
Luciano Resende [Sun, 26 Jun 2016 07:50:12 +0000 (00:50 -0700)] 
[BAHIR-19] Update source distribution assembly name

Update the final assembly name and extraction directory
to use the Apache best-practice pattern:

apache-bahir-${project.version}-src

6 years ago[BAHIR-19] Create source distribution assembly
Luciano Resende [Sun, 26 Jun 2016 06:58:38 +0000 (23:58 -0700)] 
[BAHIR-19] Create source distribution assembly

Add an assembly to create the Bahir source release distribution

6 years ago[[BAHIR-14] More parent pom cleanup
Luciano Resende [Fri, 24 Jun 2016 21:57:50 +0000 (14:57 -0700)] 
[BAHIR-14] More parent pom cleanup

Remove Spark assembly related configuration, and
stop producing source jars for non-jar projects.

6 years ago[BAHIR-17] Disable javadoc generation
Luciano Resende [Sat, 18 Jun 2016 21:13:52 +0000 (14:13 -0700)] 
[BAHIR-17] Disable javadoc generation

6 years ago[MINOR] Add package related files
Luciano Resende [Sat, 18 Jun 2016 21:12:01 +0000 (14:12 -0700)] 
[MINOR] Add package related files

6 years ago[BAHIR-7] Update Apache Spark version to 2.0.0-preview
Luciano Resende [Sat, 18 Jun 2016 20:24:22 +0000 (13:24 -0700)] 
[BAHIR-7] Update Apache Spark version to 2.0.0-preview

6 years ago[BAHIR-16] Add missing log4j.properties
Luciano Resende [Sat, 18 Jun 2016 20:12:12 +0000 (13:12 -0700)] 
[BAHIR-16] Add missing log4j.properties

6 years ago[BAHIR-15] Enable RAT on builds
Luciano Resende [Sat, 18 Jun 2016 20:06:54 +0000 (13:06 -0700)] 
[BAHIR-15] Enable RAT on builds

Enable RAT to run automatically during builds
to verify license header policy.

6 years ago[BAHIR-14] Cleanup Bahir parent pom
Luciano Resende [Sat, 18 Jun 2016 16:53:15 +0000 (09:53 -0700)] 
[BAHIR-14] Cleanup Bahir parent pom

The Bahir parent pom was initially based on the Spark parent pom
and brought in a lot of unnecessary dependencies. This commit
cleans up most of the unused properties, dependencies, etc.

Closes #1

6 years ago[BAHIR-13] Update dependencies on spark-tags
Luciano Resende [Thu, 16 Jun 2016 05:06:43 +0000 (22:06 -0700)] 
[BAHIR-13] Update dependencies on spark-tags

The spark-test-tags and spark-tags modules were merged in
revision 8ad9f08c9, and the modules in Bahir need to be
updated to properly use the spark-tags dependency.

6 years ago[BAHIR-2] Initial maven build for Bahir spark extensions
Luciano Resende [Wed, 8 Jun 2016 06:35:17 +0000 (23:35 -0700)] 
[BAHIR-2] Initial maven build for Bahir spark extensions

6 years agoAdd project LICENSE and NOTICE files
Luciano Resende [Fri, 20 May 2016 01:22:18 +0000 (18:22 -0700)] 
Add project LICENSE and NOTICE files

6 years agoAdd git configuration files
Luciano Resende [Fri, 20 May 2016 01:21:47 +0000 (18:21 -0700)] 
Add git configuration files

6 years agoUpdate README.md
Christian Kadner [Tue, 7 Jun 2016 05:40:32 +0000 (22:40 -0700)] 
Update README.md

6 years agoUpdate README.md
Christian Kadner [Tue, 7 Jun 2016 05:36:46 +0000 (22:36 -0700)] 
Update README.md

6 years agoCreate README.md
Christian Kadner [Tue, 7 Jun 2016 05:36:01 +0000 (22:36 -0700)] 
Create README.md

6 years ago[SPARK-13848][SPARK-5185] Update to Py4J 0.9.2 in order to fix classloading issue
Josh Rosen [Mon, 14 Mar 2016 19:22:02 +0000 (12:22 -0700)] 
[SPARK-13848][SPARK-5185] Update to Py4J 0.9.2 in order to fix classloading issue

This patch upgrades Py4J from 0.9.1 to 0.9.2 in order to include a patch which modifies Py4J to use the current thread's ContextClassLoader when performing reflection / class loading. This is necessary in order to fix [SPARK-5185](https://issues.apache.org/jira/browse/SPARK-5185), a longstanding issue affecting the use of `--jars` and `--packages` in PySpark.

In order to demonstrate that the fix works, I removed the workarounds which were added as part of [SPARK-6027](https://issues.apache.org/jira/browse/SPARK-6027) / #4779 and other patches.

Py4J diff: https://github.com/bartdag/py4j/compare/0.9.1...0.9.2

/cc zsxwing tdas davies brkyvz

Author: Josh Rosen <joshrosen@databricks.com>

Closes #11687 from JoshRosen/py4j-0.9.2.

6 years ago[SPARK-13823][CORE][STREAMING][SQL] Always specify Charset in String <-> byte[] conve...
Sean Owen [Mon, 14 Mar 2016 04:03:49 +0000 (21:03 -0700)] 
[SPARK-13823][CORE][STREAMING][SQL] Always specify Charset in String <-> byte[] conversions (and remaining Coverity items)

## What changes were proposed in this pull request?

- Fixes calls to `new String(byte[])` or `String.getBytes()` that rely on platform default encoding, to use UTF-8
- Same for `InputStreamReader` and `OutputStreamWriter` constructors
- Standardizes on UTF-8 everywhere
- Standardizes specifying the encoding with `StandardCharsets.UTF_8`, not the Guava constant or "UTF-8" (which means handling `UnsupportedEncodingException`)
- (also addresses the other remaining Coverity scan issues, which are pretty trivial; these are separated into commit https://github.com/srowen/spark/commit/1deecd8d9ca986d8adb1a42d315890ce5349d29c )

## How was this patch tested?

Jenkins tests

Author: Sean Owen <sowen@cloudera.com>

Closes #11657 from srowen/SPARK-13823.
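
A minimal before/after sketch of the String <-> byte[] encoding change described above (illustrative only, not code from the commit):

```
import java.nio.charset.StandardCharsets

object CharsetSketch {
  def main(args: Array[String]): Unit = {
    // Before: relied on the platform default charset.
    //   val bytes = "café".getBytes()
    //   val text  = new String(bytes)

    // After: the charset is always named explicitly.
    val bytes = "café".getBytes(StandardCharsets.UTF_8)
    val text = new String(bytes, StandardCharsets.UTF_8)
    println(text)
  }
}
```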

6 years ago[SPARK-13807] De-duplicate `Python*Helper` instantiation code in PySpark streaming
Josh Rosen [Fri, 11 Mar 2016 19:18:51 +0000 (11:18 -0800)] 
[SPARK-13807] De-duplicate `Python*Helper` instantiation code in PySpark streaming

This patch de-duplicates code in PySpark streaming which loads the `Python*Helper` classes. I also changed a few `raise e` statements to simply `raise` in order to preserve the full exception stacktrace when re-throwing.

Here's a link to the whitespace-change-free diff: https://github.com/apache/spark/compare/master...JoshRosen:pyspark-reflection-deduplication?w=0

Author: Josh Rosen <joshrosen@databricks.com>

Closes #11641 from JoshRosen/pyspark-reflection-deduplication.

6 years ago[SPARK-3854][BUILD] Scala style: require spaces before `{`.
Dongjoon Hyun [Thu, 10 Mar 2016 23:57:22 +0000 (15:57 -0800)] 
[SPARK-3854][BUILD] Scala style: require spaces before `{`.

## What changes were proposed in this pull request?

Since the opening curly brace, '{', has many usages as discussed in [SPARK-3854](https://issues.apache.org/jira/browse/SPARK-3854), this PR adds a ScalaStyle rule to prevent '){' pattern  for the following majority pattern and fixes the code accordingly. If we enforce this in ScalaStyle from now, it will improve the Scala code quality and reduce review time.
```
// Correct:
if (true) {
  println("Wow!")
}

// Incorrect:
if (true){
   println("Wow!")
}
```
IntelliJ also shows new warnings based on this.

## How was this patch tested?

Pass the Jenkins ScalaStyle test.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #11637 from dongjoon-hyun/SPARK-3854.

6 years ago[SPARK-13702][CORE][SQL][MLLIB] Use diamond operator for generic instance creation...
Dongjoon Hyun [Wed, 9 Mar 2016 10:31:26 +0000 (10:31 +0000)] 
[SPARK-13702][CORE][SQL][MLLIB] Use diamond operator for generic instance creation in Java code.

## What changes were proposed in this pull request?

In order to make `docs/examples` (and other related code) more simple/readable/user-friendly, this PR replaces existing code like the following with the `diamond` operator.

```
-    final ArrayList<Product2<Object, Object>> dataToWrite =
-      new ArrayList<Product2<Object, Object>>();
+    final ArrayList<Product2<Object, Object>> dataToWrite = new ArrayList<>();
```

Java 7 and higher support the **diamond** operator, which replaces the type arguments required to invoke the constructor of a generic class with an empty set of type parameters (<>). Currently, Spark Java code mixes both usages.

## How was this patch tested?

Manual.
Pass the existing tests.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #11541 from dongjoon-hyun/SPARK-13702.

6 years agoFixing the type of the sentiment happiness value
Yury Liavitski [Mon, 7 Mar 2016 10:54:33 +0000 (10:54 +0000)] 
Fixing the type of the sentiment happiness value

## What changes were proposed in this pull request?

Added the conversion to int for the 'happiness value' read from the file. Otherwise, later on line 75 the multiplication will multiply a string by a number, which repeats the string and yields values like "-2-2" instead of -4.

## How was this patch tested?

Tested manually.

Author: Yury Liavitski <seconds.before@gmail.com>
Author: Yury Liavitski <yury.liavitski@il111.ice.local>

Closes #11540 from heliocentrist/fix-sentiment-value-type.

6 years ago[MINOR] Fix typos in comments and testcase name of code
Dongjoon Hyun [Thu, 3 Mar 2016 22:42:12 +0000 (22:42 +0000)] 
[MINOR] Fix typos in comments and testcase name of code

## What changes were proposed in this pull request?

This PR fixes typos in comments and testcase name of code.

## How was this patch tested?

manual.

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #11481 from dongjoon-hyun/minor_fix_typos_in_code.

6 years ago[SPARK-13583][CORE][STREAMING] Remove unused imports and add checkstyle rule
Dongjoon Hyun [Thu, 3 Mar 2016 10:12:32 +0000 (10:12 +0000)] 
[SPARK-13583][CORE][STREAMING] Remove unused imports and add checkstyle rule

## What changes were proposed in this pull request?

After SPARK-6990, `dev/lint-java` keeps Java code healthy and helps PR review by saving a lot of time.
This issue aims to remove unused imports from Java/Scala code and to add an `UnusedImports` checkstyle rule to help developers.

## How was this patch tested?
```
./dev/lint-java
./build/sbt compile
```

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #11438 from dongjoon-hyun/SPARK-13583.

6 years ago[SPARK-13423][WIP][CORE][SQL][STREAMING] Static analysis fixes for 2.x
Sean Owen [Thu, 3 Mar 2016 09:54:09 +0000 (09:54 +0000)] 
[SPARK-13423][WIP][CORE][SQL][STREAMING] Static analysis fixes for 2.x

## What changes were proposed in this pull request?

Make some cross-cutting code improvements according to static analysis. These are individually up for discussion since they exist in separate commits that can be reverted. The changes are broadly:

- Inner class should be static
- Mismatched hashCode/equals
- Overflow in compareTo
- Unchecked warnings
- Misuse of assert, vs junit.assert
- get(a) + getOrElse(b) -> getOrElse(a,b)
- Array/String .size -> .length (occasionally, -> .isEmpty / .nonEmpty) to avoid implicit conversions
- Dead code
- tailrec
- exists(_ == ) -> contains
- find + nonEmpty -> exists
- filter + size -> count
- reduce(_+_) -> sum
- map + flatten -> map

The most controversial may be .size -> .length, simply because of the sheer size of the change. It is intended to avoid implicit conversions that might be expensive in some places.

## How was this patch tested?

Existing Jenkins unit tests.

Author: Sean Owen <sowen@cloudera.com>

Closes #11292 from srowen/SPARK-13423.
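
A few of the listed rewrites, shown as a small illustrative sketch (not taken from the actual diff):

```
object StaticAnalysisSketch {
  def main(args: Array[String]): Unit = {
    val xs = Seq(1, 2, 3, 4)
    val m = Map("a" -> 1)

    val a = m.getOrElse("a", 0)        // was: m.get("a") plus a separate getOrElse fallback
    val b = xs.contains(2)             // was: xs.exists(_ == 2)
    val c = xs.count(_ % 2 == 0)       // was: xs.filter(_ % 2 == 0).size
    val d = xs.sum                     // was: xs.reduce(_ + _)
    val e = "spark".length             // was: "spark".size (implicit conversion)

    println(Seq(a, b, c, d, e).mkString(", "))
  }
}
```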

6 years ago[MINOR][STREAMING] Replace deprecated `apply` with `create` in example.
Dongjoon Hyun [Wed, 2 Mar 2016 11:48:23 +0000 (11:48 +0000)] 
[MINOR][STREAMING] Replace deprecated `apply` with `create` in example.

## What changes were proposed in this pull request?

Twitter Algebird deprecated `apply` in HyperLogLog.scala.
```
deprecated("Use toHLL", since = "0.10.0 / 2015-05")
def apply[T <% Array[Byte]](t: T) = create(t)
```
This PR replaces the deprecated usage of `apply` with the new `create`,
according to the upstream change.

## How was this patch tested?
manual.
```
/bin/spark-submit --class org.apache.spark.examples.streaming.TwitterAlgebirdHLL examples/target/scala-2.11/spark-examples-2.0.0-SNAPSHOT-hadoop2.2.0.jar
```

Author: Dongjoon Hyun <dongjoon@apache.org>

Closes #11451 from dongjoon-hyun/replace_deprecated_hll_apply.

6 years ago[SPARK-13069][STREAMING] Add "ask" style store() to ActorReceiver
Lin Zhao [Thu, 25 Feb 2016 20:32:17 +0000 (12:32 -0800)] 
[SPARK-13069][STREAMING] Add "ask" style store() to ActorReceiver

Introduces a "ask" style ```store``` in ```ActorReceiver``` as a way to allow actor receiver blocked by back pressure or maxRate.

Author: Lin Zhao <lin@exabeam.com>

Closes #11176 from lin-zhao/SPARK-13069.

6 years ago[SPARK-13339][DOCS] Clarify commutative / associative operator requirements for reduc...
Sean Owen [Fri, 19 Feb 2016 10:26:38 +0000 (10:26 +0000)] 
[SPARK-13339][DOCS] Clarify commutative / associative operator requirements for reduce, fold

Clarify that reduce functions need to be commutative, and fold functions do not

See https://github.com/apache/spark/pull/11091

Author: Sean Owen <sowen@cloudera.com>

Closes #11217 from srowen/SPARK-13339.
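
An illustrative sketch of why the operator passed to reduce must be well-behaved: with a non-associative, non-commutative operator such as subtraction, the result can depend on partitioning and task order. The local master and numbers are assumptions for the example:

```
import org.apache.spark.{SparkConf, SparkContext}

object ReduceOperatorSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("ReduceOperatorSketch").setMaster("local[2]"))

    val data = sc.parallelize(Seq(1, 2, 3, 4), numSlices = 2)

    // Addition is commutative and associative: always 10.
    println(data.reduce(_ + _))

    // Subtraction is neither: the value can change with how elements are
    // grouped into partitions -- the pitfall this doc change warns about.
    println(data.reduce(_ - _))

    sc.stop()
  }
}
```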

6 years ago[SPARK-13177][EXAMPLES] Update ActorWordCount example to not directly use low level...
sachin aggarwal [Tue, 9 Feb 2016 08:52:58 +0000 (08:52 +0000)] 
[SPARK-13177][EXAMPLES] Update ActorWordCount example to not directly use low level linked list as it is deprecated.

Author: sachin aggarwal <different.sachin@gmail.com>

Closes #11113 from agsachin/master.

6 years ago[SPARK-6363][BUILD] Make Scala 2.11 the default Scala version
Josh Rosen [Sat, 30 Jan 2016 08:20:28 +0000 (00:20 -0800)] 
[SPARK-6363][BUILD] Make Scala 2.11 the default Scala version

This patch changes Spark's build to make Scala 2.11 the default Scala version. To be clear, this does not mean that Spark will stop supporting Scala 2.10: users will still be able to compile Spark for Scala 2.10 by following the instructions on the "Building Spark" page; however, it does mean that Scala 2.11 will be the default Scala version used by our CI builds (including pull request builds).

The Scala 2.11 compiler is faster than 2.10, so I think we'll be able to look forward to a slight speedup in our CI builds (it looks like it's about 2X faster for the Maven compile-only builds, for instance).

After this patch is merged, I'll update Jenkins to add new compile-only jobs to ensure that Scala 2.10 compilation doesn't break.

Author: Josh Rosen <joshrosen@databricks.com>

Closes #10608 from JoshRosen/SPARK-6363.

6 years ago[SPARK-3369][CORE][STREAMING] Java mapPartitions Iterator->Iterable is inconsistent...
Sean Owen [Tue, 26 Jan 2016 11:55:28 +0000 (11:55 +0000)] 
[SPARK-3369][CORE][STREAMING] Java mapPartitions Iterator->Iterable is inconsistent with Scala's Iterator->Iterator

Fix Java function API methods for flatMap and mapPartitions to require producing only an Iterator, not Iterable. Also fix DStream.flatMap to require a function producing TraversableOnce only, not Traversable.

CC rxin pwendell for API change; tdas since it also touches streaming.

Author: Sean Owen <sowen@cloudera.com>

Closes #10413 from srowen/SPARK-3369.

6 years ago[HOTFIX][BUILD][TEST-MAVEN] Remove duplicate dependency
Shixiong Zhu [Fri, 22 Jan 2016 20:33:18 +0000 (12:33 -0800)] 
[HOTFIX][BUILD][TEST-MAVEN] Remove duplicate dependency

Author: Shixiong Zhu <shixiong@databricks.com>

Closes #10868 from zsxwing/hotfix-akka-pom.

6 years ago[SPARK-7799][SPARK-12786][STREAMING] Add "streaming-akka" project
Shixiong Zhu [Wed, 20 Jan 2016 21:55:41 +0000 (13:55 -0800)] 
[SPARK-7799][SPARK-12786][STREAMING] Add "streaming-akka" project

Include the following changes:

1. Add "streaming-akka" project and org.apache.spark.streaming.akka.AkkaUtils for creating an actorStream
2. Remove "StreamingContext.actorStream" and "JavaStreamingContext.actorStream"
3. Update the ActorWordCount example and add the JavaActorWordCount example
4. Make "streaming-zeromq" depend on "streaming-akka" and update the codes accordingly

Author: Shixiong Zhu <shixiong@databricks.com>

Closes #10744 from zsxwing/streaming-akka-2.
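
A minimal sketch of the new actorStream API described above; the class, package and method names are assumed from the streaming-akka documentation, not taken verbatim from this commit:

```
import akka.actor.Props
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.akka.{ActorReceiver, AkkaUtils}

// A custom actor that forwards every received String to Spark Streaming.
class EchoActor extends ActorReceiver {
  def receive: Receive = {
    case data: String => store(data)
  }
}

object AkkaStreamSketch {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("AkkaStreamSketch").setMaster("local[2]"), Seconds(2))

    // Create an input DStream backed by the actor above.
    val lines = AkkaUtils.createStream[String](ssc, Props[EchoActor], "EchoReceiver")
    lines.print()

    ssc.start()
    ssc.awaitTermination()
  }
}
```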

6 years ago[SPARK-12692][BUILD][STREAMING] Scala style: Fix the style violation (Space before...
Kousuke Saruta [Tue, 12 Jan 2016 05:06:22 +0000 (21:06 -0800)] 
[SPARK-12692][BUILD][STREAMING] Scala style: Fix the style violation (Space before "," or ":")

Fix the style violation (space before , and :).
This PR is a followup for #10643.

Author: Kousuke Saruta <sarutak@oss.nttdata.co.jp>

Closes #10685 from sarutak/SPARK-12692-followup-streaming.

6 years ago[SPARK-4628][BUILD] Remove all non-Maven-Central repositories from build
Josh Rosen [Sat, 9 Jan 2016 04:58:53 +0000 (20:58 -0800)] 
[SPARK-4628][BUILD] Remove all non-Maven-Central repositories from build

This patch removes all non-Maven-central repositories from Spark's build, thereby avoiding any risk of future build-breaks due to us accidentally depending on an artifact which is not present in an immutable public Maven repository.

I tested this by running

```
build/mvn \
        -Phive \
        -Phive-thriftserver \
        -Pkinesis-asl \
        -Pspark-ganglia-lgpl \
        -Pyarn \
        dependency:go-offline
```

inside of a fresh Ubuntu Docker container with no Ivy or Maven caches (I did a similar test for SBT).

Author: Josh Rosen <joshrosen@databricks.com>

Closes #10659 from JoshRosen/SPARK-4628.

6 years ago[SPARK-12618][CORE][STREAMING][SQL] Clean up build warnings: 2.0.0 edition
Sean Owen [Fri, 8 Jan 2016 17:47:44 +0000 (17:47 +0000)] 
[SPARK-12618][CORE][STREAMING][SQL] Clean up build warnings: 2.0.0 edition

Fix most build warnings: mostly deprecated API usages. I'll annotate some of the changes below. CC rxin who is leading the charge to remove the deprecated APIs.

Author: Sean Owen <sowen@cloudera.com>

Closes #10570 from srowen/SPARK-12618.

6 years ago[SPARK-12510][STREAMING] Refactor ActorReceiver to support Java
Shixiong Zhu [Thu, 7 Jan 2016 23:26:55 +0000 (15:26 -0800)] 
[SPARK-12510][STREAMING] Refactor ActorReceiver to support Java

This PR includes the following changes:

1. Rename `ActorReceiver` to `ActorReceiverSupervisor`
2. Remove `ActorHelper`
3. Add a new `ActorReceiver` for Scala and `JavaActorReceiver` for Java
4. Add `JavaActorWordCount` example

Author: Shixiong Zhu <shixiong@databricks.com>

Closes #10457 from zsxwing/java-actor-stream.

6 years ago[STREAMING][DOCS][EXAMPLES] Minor fixes
Jacek Laskowski [Thu, 7 Jan 2016 08:27:13 +0000 (00:27 -0800)] 
[STREAMING][DOCS][EXAMPLES] Minor fixes

Author: Jacek Laskowski <jacek@japila.pl>

Closes #10603 from jaceklaskowski/streaming-actor-custom-receiver.

6 years ago[SPARK-3873][TESTS] Import ordering fixes.
Marcelo Vanzin [Wed, 6 Jan 2016 03:07:39 +0000 (19:07 -0800)] 
[SPARK-3873][TESTS] Import ordering fixes.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #10582 from vanzin/SPARK-3873-tests.

6 years ago[SPARK-3873][EXAMPLES] Import ordering fixes.
Marcelo Vanzin [Tue, 5 Jan 2016 06:42:54 +0000 (22:42 -0800)] 
[SPARK-3873][EXAMPLES] Import ordering fixes.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #10575 from vanzin/SPARK-3873-examples.

6 years ago[SPARK-3873][STREAMING] Import order fixes for streaming.
Marcelo Vanzin [Thu, 31 Dec 2015 09:34:13 +0000 (01:34 -0800)] 
[SPARK-3873][STREAMING] Import order fixes for streaming.

Also included a few miscellaneous other modules that had very few violations.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #10532 from vanzin/SPARK-3873-streaming.

6 years ago[SPARK-12353][STREAMING][PYSPARK] Fix countByValue inconsistent output in Python API
jerryshao [Mon, 28 Dec 2015 10:43:23 +0000 (10:43 +0000)] 
[SPARK-12353][STREAMING][PYSPARK] Fix countByValue inconsistent output in Python API

The semantics of the Python countByValue are different from the Scala API; it behaves more like countDistinctValue. This changes it to be consistent with the Scala/Java API.

Author: jerryshao <sshao@hortonworks.com>

Closes #10350 from jerryshao/SPARK-12353.
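
An illustrative sketch of the intended countByValue semantics (per-value counts, not a count of distinct values), shown on an RDD in Scala for brevity:

```
import org.apache.spark.{SparkConf, SparkContext}

object CountByValueSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("CountByValueSketch").setMaster("local[2]"))

    val words = sc.parallelize(Seq("a", "b", "a", "c", "a"))

    // countByValue returns per-value counts: Map(a -> 3, b -> 1, c -> 1)
    println(words.countByValue())

    // A "countDistinctValue" would instead be the number of distinct values: 3
    println(words.distinct().count())

    sc.stop()
  }
}
```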

6 years agoBump master version to 2.0.0-SNAPSHOT.
Reynold Xin [Sat, 19 Dec 2015 23:13:05 +0000 (15:13 -0800)] 
Bump master version to 2.0.0-SNAPSHOT.

Author: Reynold Xin <rxin@databricks.com>

Closes #10387 from rxin/version-bump.

6 years ago[SPARK-12091] [PYSPARK] Deprecate the JAVA-specific deserialized storage levels
gatorsmile [Sat, 19 Dec 2015 04:06:05 +0000 (20:06 -0800)] 
[SPARK-12091] [PYSPARK] Deprecate the JAVA-specific deserialized storage levels

The current default storage level of Python persist API is MEMORY_ONLY_SER. This is different from the default level MEMORY_ONLY in the official document and RDD APIs.

davies Is this inconsistency intentional? Thanks!

Updates: Since the data is always serialized on the Python side, the storage levels of JAVA-specific deserialization are not removed, such as MEMORY_ONLY.

Updates: Based on the reviewers' feedback. In Python, stored objects will always be serialized with the [Pickle](https://docs.python.org/2/library/pickle.html) library, so it does not matter whether you choose a serialized level. The available storage levels in Python include `MEMORY_ONLY`, `MEMORY_ONLY_2`, `MEMORY_AND_DISK`, `MEMORY_AND_DISK_2`, `DISK_ONLY`, `DISK_ONLY_2` and `OFF_HEAP`.

Author: gatorsmile <gatorsmile@gmail.com>

Closes #10092 from gatorsmile/persistStorageLevel.

6 years ago[SPARK-9057][STREAMING] Twitter example joining to static RDD of word sentiment values
Jeff L [Fri, 18 Dec 2015 15:06:54 +0000 (15:06 +0000)] 
[SPARK-9057][STREAMING] Twitter example joining to static RDD of word sentiment values

Example of joining a static RDD of word sentiments to a streaming RDD of Tweets in order to demo the usage of the transform() method.

Author: Jeff L <sha0lin@alumni.carnegiemellon.edu>

Closes #8431 from Agent007/SPARK-9057.
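
A hedged sketch of the transform() pattern the example demonstrates: joining each streaming batch against a static RDD of word sentiment scores. The socket source, words and scores below are placeholders, not the actual Twitter example code:

```
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object SentimentJoinSketch {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("SentimentJoinSketch").setMaster("local[2]"), Seconds(5))

    // Static RDD of (word, sentiment score) pairs.
    val wordSentiments = ssc.sparkContext.parallelize(
      Seq("happy" -> 1.0, "sad" -> -1.0, "angry" -> -2.0))

    // Placeholder input stream; the original example uses a Twitter stream.
    val words = ssc.socketTextStream("localhost", 9999).flatMap(_.split(" "))
    val wordCounts = words.map(w => (w, 1)).reduceByKey(_ + _)

    // transform() exposes the underlying RDD of each batch,
    // so it can be joined with the static sentiment RDD.
    val scored = wordCounts.transform { rdd =>
      rdd.join(wordSentiments).map { case (word, (count, score)) => (word, count * score) }
    }

    scored.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```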

6 years ago[SPARK-11904][PYSPARK] reduceByKeyAndWindow does not require checkpointing when invFu...
David Tolpin [Thu, 17 Dec 2015 06:10:24 +0000 (22:10 -0800)] 
[SPARK-11904][PYSPARK] reduceByKeyAndWindow does not require checkpointing when invFunc is None

When invFunc is None, `reduceByKeyAndWindow(func, None, winsize, slidesize)` is equivalent to

     reduceByKey(func).window(winsize, slidesize).reduceByKey(func)

and no checkpoint is necessary. The corresponding Scala code does exactly that, but Python code always creates a windowed stream with obligatory checkpointing. The patch fixes this.

I do not know how to unit-test this.

Author: David Tolpin <david.tolpin@gmail.com>

Closes #9888 from dtolpin/master.
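
A Scala sketch (for clarity; the patch itself is Python) of the equivalence the commit relies on: without an inverse function, a windowed reduce is just a per-batch reduce, a window, and a second reduce, so no checkpointing is needed. The durations and socket source are assumptions for the example:

```
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object WindowEquivalenceSketch {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("WindowEquivalenceSketch").setMaster("local[2]"), Seconds(1))

    val pairs = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split(" "))
      .map(w => (w, 1))

    // Windowed reduce without an inverse function...
    val windowed = pairs.reduceByKeyAndWindow(_ + _, Seconds(30), Seconds(10))

    // ...is equivalent to reducing each batch, windowing, and reducing again.
    val equivalent = pairs.reduceByKey(_ + _).window(Seconds(30), Seconds(10)).reduceByKey(_ + _)

    windowed.print()
    equivalent.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```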

6 years ago[SPARK-11713] [PYSPARK] [STREAMING] Initial RDD updateStateByKey for PySpark
Bryan Cutler [Thu, 10 Dec 2015 22:21:15 +0000 (14:21 -0800)] 
[SPARK-11713] [PYSPARK] [STREAMING] Initial RDD updateStateByKey for PySpark

Adding ability to define an initial state RDD for use with updateStateByKey PySpark.  Added unit test and changed stateful_network_wordcount example to use initial RDD.

Author: Bryan Cutler <bjcutler@us.ibm.com>

Closes #10082 from BryanCutler/initial-rdd-updateStateByKey-SPARK-11713.

6 years ago[SPARK-12023][BUILD] Fix warnings while packaging spark with maven.
Prashant Sharma [Mon, 30 Nov 2015 10:11:27 +0000 (10:11 +0000)] 
[SPARK-12023][BUILD] Fix warnings while packaging spark with maven.

this is a trivial fix, discussed [here](http://stackoverflow.com/questions/28500401/maven-assembly-plugin-warning-the-assembly-descriptor-contains-a-filesystem-roo/).

Author: Prashant Sharma <scrapcodes@gmail.com>

Closes #10014 from ScrapCodes/assembly-warning.

6 years ago[SPARK-11812][PYSPARK] invFunc=None works properly with python's reduceByKeyAndWindow
David Tolpin [Thu, 19 Nov 2015 21:57:23 +0000 (13:57 -0800)] 
[SPARK-11812][PYSPARK] invFunc=None works properly with python's reduceByKeyAndWindow

invFunc is optional and can be None. Instead of invFunc (the parameter), invReduceFunc (a local function) was checked for truthiness (that is, not being None, in this context). A local function is never None,
thus the case of invFunc=None (a common one when inverse reduction is not defined) was handled incorrectly, resulting in loss of data.

In addition, the docstring used the wrong parameter names; this is also fixed.

Author: David Tolpin <david.tolpin@gmail.com>

Closes #9775 from dtolpin/master.

6 years ago[SPARK-6328][PYTHON] Python API for StreamingListener
Daniel Jalova [Mon, 16 Nov 2015 19:29:27 +0000 (11:29 -0800)] 
[SPARK-6328][PYTHON] Python API for StreamingListener

Author: Daniel Jalova <djalova@us.ibm.com>

Closes #9186 from djalova/SPARK-6328.

6 years ago[SPARK-11245] update twitter4j to 4.0.4 version
dima [Sat, 24 Oct 2015 17:16:45 +0000 (18:16 +0100)] 
[SPARK-11245] update twitter4j to 4.0.4 version

update twitter4j to 4.0.4 version
https://issues.apache.org/jira/browse/SPARK-11245

Author: dima <pronix.service@gmail.com>

Closes #9221 from pronix/twitter4j_update.

6 years ago[SPARK-10447][SPARK-3842][PYSPARK] upgrade pyspark to py4j0.9
Holden Karau [Tue, 20 Oct 2015 17:52:49 +0000 (10:52 -0700)] 
[SPARK-10447][SPARK-3842][PYSPARK] upgrade pyspark to py4j0.9

Upgrade to Py4j0.9

Author: Holden Karau <holden@pigscanfly.ca>
Author: Holden Karau <holden@us.ibm.com>

Closes #8615 from holdenk/SPARK-10447-upgrade-pyspark-to-py4j0.9.

6 years ago[SPARK-10300] [BUILD] [TESTS] Add support for test tags in run-tests.py.
Marcelo Vanzin [Wed, 7 Oct 2015 21:11:21 +0000 (14:11 -0700)] 
[SPARK-10300] [BUILD] [TESTS] Add support for test tags in run-tests.py.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #8775 from vanzin/SPARK-10300.

7 years ago[DOC] [PYSPARK] [MLLIB] Added newlines to docstrings to fix parameter formatting
noelsmith [Mon, 21 Sep 2015 21:24:19 +0000 (14:24 -0700)] 
[DOC] [PYSPARK] [MLLIB] Added newlines to docstrings to fix parameter formatting

Added newlines before `:param ...:` and `:return:` markup. Without these, parameter lists aren't formatted correctly in the API docs. I.e:

![screen shot 2015-09-21 at 21 49 26](https://cloud.githubusercontent.com/assets/11915197/10004686/de3c41d4-60aa-11e5-9c50-a46dcb51243f.png)

.. looks like this once newline is added:

![screen shot 2015-09-21 at 21 50 14](https://cloud.githubusercontent.com/assets/11915197/10004706/f86bfb08-60aa-11e5-8524-ae4436713502.png)

Author: noelsmith <mail@noelsmith.com>

Closes #8851 from noel-smith/docstring-missing-newline-fix.

7 years agoRevert "[SPARK-10300] [BUILD] [TESTS] Add support for test tags in run-tests.py."
Marcelo Vanzin [Tue, 15 Sep 2015 20:03:38 +0000 (13:03 -0700)] 
Revert "[SPARK-10300] [BUILD] [TESTS] Add support for test tags in run-tests.py."

This reverts commit 8abef21dac1a6538c4e4e0140323b83d804d602b.

7 years ago[SPARK-10300] [BUILD] [TESTS] Add support for test tags in run-tests.py.
Marcelo Vanzin [Tue, 15 Sep 2015 17:45:02 +0000 (10:45 -0700)] 
[SPARK-10300] [BUILD] [TESTS] Add support for test tags in run-tests.py.

This change does two things:

- tag a few tests and add the mechanism in the build to be able to disable those tags,
  both in maven and sbt, for both junit and scalatest suites.
- add some logic to run-tests.py to disable some tags depending on what files have
  changed; that's used to disable expensive tests when a module hasn't explicitly
  been changed, to speed up testing for changes that don't directly affect those
  modules.

Author: Marcelo Vanzin <vanzin@cloudera.com>

Closes #8437 from vanzin/test-tags.
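
An illustrative sketch of tagging a ScalaTest suite so the build can exclude it; the tag name used here is made up and not the one actually introduced by this change:

```
import org.scalatest.{FunSuite, Tag}

// Hypothetical tag object; the real tag names differ.
object SlowTest extends Tag("org.example.tags.SlowTest")

class TaggedSuite extends FunSuite {

  // Expensive test that CI can skip, e.g. by excluding the tag
  // with "-l org.example.tags.SlowTest".
  test("expensive end-to-end check", SlowTest) {
    assert(1 + 1 === 2)
  }

  // Cheap test that always runs.
  test("cheap unit check") {
    assert("spark".length === 5)
  }
}
```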

7 years agoUpdate version to 1.6.0-SNAPSHOT.
Reynold Xin [Tue, 15 Sep 2015 07:54:20 +0000 (00:54 -0700)] 
Update version to 1.6.0-SNAPSHOT.

Author: Reynold Xin <rxin@databricks.com>

Closes #8350 from rxin/1.6.

7 years ago[SPARK-10547] [TEST] Streamline / improve style of Java API tests
Sean Owen [Sat, 12 Sep 2015 09:40:10 +0000 (10:40 +0100)] 
[SPARK-10547] [TEST] Streamline / improve style of Java API tests

Fix a few Java API test style issues: unused generic types, exceptions, wrong assert argument order

Author: Sean Owen <sowen@cloudera.com>

Closes #8706 from srowen/SPARK-10547.

7 years ago[SPARK-10227] fatal warnings with sbt on Scala 2.11
Luc Bourlier [Wed, 9 Sep 2015 08:57:58 +0000 (09:57 +0100)] 
[SPARK-10227] fatal warnings with sbt on Scala 2.11

The bulk of the changes concern the `transient` annotation on class parameters. Often the compiler doesn't generate a field for these parameters, so the transient annotation would be unnecessary.
But if the class parameters are used in methods, then fields are created, so it is safer to keep the annotations.

The remainder are some potential bugs and uses of deprecated syntax.

Author: Luc Bourlier <luc.bourlier@typesafe.com>

Closes #8433 from skyluc/issue/sbt-2.11.
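
A small sketch of the @transient point above (illustrative, not from the patch): when a constructor parameter is referenced from a method, the compiler keeps it as a field, so the annotation genuinely affects serialization; when it is used only during construction, no field is generated.

```
// The parameter is referenced from a method, so the compiler keeps it as a
// field; @transient then genuinely excludes it from serialization.
class UsedInMethod(@transient conf: Map[String, String]) extends Serializable {
  def lookup(key: String): Option[String] = conf.get(key)
}

// The parameter is used only during construction, so no field is generated
// and @transient would have nothing to annotate -- the "unnecessary" case.
class OnlyInConstructor(conf: Map[String, String]) extends Serializable {
  val size: Int = conf.size
}
```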

7 years ago[SPARK-9613] [CORE] Ban use of JavaConversions and migrate all existing uses to JavaC...
Sean Owen [Tue, 25 Aug 2015 11:33:13 +0000 (12:33 +0100)] 
[SPARK-9613] [CORE] Ban use of JavaConversions and migrate all existing uses to JavaConverters

Replace `JavaConversions` implicits with `JavaConverters`

Most occurrences I've seen so far are necessary conversions; a few have been avoidable. None are in critical code as far as I see, yet.

Author: Sean Owen <sowen@cloudera.com>

Closes #8033 from srowen/SPARK-9613.
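
A short before/after sketch of the migration described above (illustrative only):

```
import java.util.{ArrayList => JArrayList}
import scala.collection.JavaConverters._

object ConvertersSketch {
  def main(args: Array[String]): Unit = {
    val javaList = new JArrayList[String]()
    javaList.add("spark")
    javaList.add("bahir")

    // Before: `import scala.collection.JavaConversions._` converted implicitly
    // and invisibly at the call site.
    // After: the conversion is spelled out with .asScala / .asJava.
    val scalaSeq = javaList.asScala.toSeq
    val backToJava = scalaSeq.asJava

    println(scalaSeq.mkString(", "))
    println(backToJava.size())
  }
}
```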

7 years ago[SPARK-9791] [PACKAGE] Change private class to private[package] class to prevent unnecessary...
Tathagata Das [Mon, 24 Aug 2015 19:40:09 +0000 (12:40 -0700)] 
[SPARK-9791] [PACKAGE] Change private class to private[package] class to prevent unnecessary classes from showing up in the docs

In addition, some random cleanup of import ordering

Author: Tathagata Das <tathagata.das1565@gmail.com>

Closes #8387 from tdas/SPARK-9791 and squashes the following commits:

67f3ee9 [Tathagata Das] Change private class to private[package] class to prevent them from showing up in the docs

7 years ago[SPARK-10122] [PYSPARK] [STREAMING] Fix getOffsetRanges bug in PySpark-Streaming...
jerryshao [Fri, 21 Aug 2015 20:10:11 +0000 (13:10 -0700)] 
[SPARK-10122] [PYSPARK] [STREAMING] Fix getOffsetRanges bug in PySpark-Streaming transform function

Details of the bug and explanations can be seen in [SPARK-10122](https://issues.apache.org/jira/browse/SPARK-10122).

tdas , please help to review.

Author: jerryshao <sshao@hortonworks.com>

Closes #8347 from jerryshao/SPARK-10122 and squashes the following commits:

4039b16 [jerryshao] Fix getOffsetRanges in transform() bug

7 years ago[SPARK-9812] [STREAMING] Fix Python 3 compatibility issue in PySpark Streaming and...
zsxwing [Thu, 20 Aug 2015 01:36:01 +0000 (18:36 -0700)] 
[SPARK-9812] [STREAMING] Fix Python 3 compatibility issue in PySpark Streaming and some docs

This PR includes the following fixes:
1. Use `range` instead of `xrange` in `queue_stream.py` to support Python 3.
2. Fix the issue that `utf8_decoder` will return `bytes` rather than `str` when receiving an empty `bytes` in Python 3.
3. Fix the commands in docs so that the user can copy them directly to the command line. The previous commands were broken in the middle of a path, so when copied to the command line, the path would be split into two parts by the extra spaces, forcing the user to fix it manually.

Author: zsxwing <zsxwing@gmail.com>

Closes #8315 from zsxwing/SPARK-9812.

7 years ago[SPARK-5155] [PYSPARK] [STREAMING] Mqtt streaming support in Python
Prabeesh K [Mon, 10 Aug 2015 23:33:23 +0000 (16:33 -0700)] 
[SPARK-5155] [PYSPARK] [STREAMING] Mqtt streaming support in Python

This PR is based on #4229, thanks prabeesh.

Closes #4229

Author: Prabeesh K <prabsmails@gmail.com>
Author: zsxwing <zsxwing@gmail.com>
Author: prabs <prabsmails@gmail.com>
Author: Prabeesh K <prabeesh.k@namshi.com>

Closes #7833 from zsxwing/pr4229 and squashes the following commits:

9570bec [zsxwing] Fix the variable name and check null in finally
4a9c79e [zsxwing] Fix pom.xml indentation
abf5f18 [zsxwing] Merge branch 'master' into pr4229
935615c [zsxwing] Fix the flaky MQTT tests
47278c5 [zsxwing] Include the project class files
478f844 [zsxwing] Add unpack
5f8a1d4 [zsxwing] Make the maven build generate the test jar for Python MQTT tests
734db99 [zsxwing] Merge branch 'master' into pr4229
126608a [Prabeesh K] address the comments
b90b709 [Prabeesh K] Merge pull request #1 from zsxwing/pr4229
d07f454 [zsxwing] Register StreamingListerner before starting StreamingContext; Revert unncessary changes; fix the python unit test
a6747cb [Prabeesh K] wait for starting the receiver before publishing data
87fc677 [Prabeesh K] address the comments:
97244ec [zsxwing] Make sbt build the assembly test jar for streaming mqtt
80474d1 [Prabeesh K] fix
1f0cfe9 [Prabeesh K] python style fix
e1ee016 [Prabeesh K] scala style fix
a5a8f9f [Prabeesh K] added Python test
9767d82 [Prabeesh K] implemented Python-friendly class
a11968b [Prabeesh K] fixed python style
795ec27 [Prabeesh K] address comments
ee387ae [Prabeesh K] Fix assembly jar location of mqtt-assembly
3f4df12 [Prabeesh K] updated version
b34c3c1 [prabs] adress comments
3aa7fff [prabs] Added Python streaming mqtt word count example
b7d42ff [prabs] Mqtt streaming support in Python

7 years ago[SPARK-9556] [SPARK-9619] [SPARK-9624] [STREAMING] Make BlockGenerator more robust...
Tathagata Das [Thu, 6 Aug 2015 21:35:30 +0000 (14:35 -0700)] 
[SPARK-9556] [SPARK-9619] [SPARK-9624] [STREAMING] Make BlockGenerator more robust and make all BlockGenerators subscribe to rate limit updates

In some receivers, instead of using the default `BlockGenerator` in `ReceiverSupervisorImpl`, custom generators with their custom listeners are used for reliability (see [`ReliableKafkaReceiver`](https://github.com/apache/spark/blob/master/external/kafka/src/main/scala/org/apache/spark/streaming/kafka/ReliableKafkaReceiver.scala#L99) and [updated `KinesisReceiver`](https://github.com/apache/spark/pull/7825/files)). These custom generators do not receive rate updates. This PR modifies the code to allow custom `BlockGenerator`s to be created through the `ReceiverSupervisorImpl` so that they can be kept track of and rate updates can be applied.

In the process, I did some simplification and de-flaki-fication of some rate-controller-related tests. In particular:
- Renamed `Receiver.executor` to `Receiver.supervisor` (to match `ReceiverSupervisor`)
- Made `RateControllerSuite` faster (by increasing batch interval) and less flaky
- Changed a few internal APIs to return the current rate of block generators as Long instead of Option[Long] (this was inconsistent in places).
- Updated existing `ReceiverTrackerSuite` to test that custom block generators get rate updates as well.

Author: Tathagata Das <tathagata.das1565@gmail.com>

Closes #7913 from tdas/SPARK-9556 and squashes the following commits:

41d4461 [Tathagata Das] fix scala style
eb9fd59 [Tathagata Das] Updated kinesis receiver
d24994d [Tathagata Das] Updated BlockGeneratorSuite to use manual clock in BlockGenerator
d70608b [Tathagata Das] Updated BlockGenerator with states and proper synchronization
f6bd47e [Tathagata Das] Merge remote-tracking branch 'apache-github/master' into SPARK-9556
31da173 [Tathagata Das] Fix bug
12116df [Tathagata Das] Add BlockGeneratorSuite
74bd069 [Tathagata Das] Fix style
989bb5c [Tathagata Das] Made BlockGenerator fail is used after stop, and added better unit tests for it
3ff618c [Tathagata Das] Fix test
b40eff8 [Tathagata Das] slight refactoring
f0df0f1 [Tathagata Das] Scala style fixes
51759cb [Tathagata Das] Refactored rate controller tests and added the ability to update rate of any custom block generator

7 years ago[SPARK-7977] [BUILD] Disallowing println
Jonathan Alter [Fri, 10 Jul 2015 10:34:01 +0000 (11:34 +0100)] 
[SPARK-7977] [BUILD] Disallowing println

Author: Jonathan Alter <jonalter@users.noreply.github.com>

Closes #7093 from jonalter/SPARK-7977 and squashes the following commits:

ccd44cc [Jonathan Alter] Changed println to log in ThreadingSuite
7fcac3e [Jonathan Alter] Reverting to println in ThreadingSuite
10724b6 [Jonathan Alter] Changing some printlns to logs in tests
eeec1e7 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
0b1dcb4 [Jonathan Alter] More println cleanup
aedaf80 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
925fd98 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
0c16fa3 [Jonathan Alter] Replacing some printlns with logs
45c7e05 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
5c8e283 [Jonathan Alter] Allowing println in audit-release examples
5b50da1 [Jonathan Alter] Allowing printlns in example files
ca4b477 [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
83ab635 [Jonathan Alter] Fixing new printlns
54b131f [Jonathan Alter] Merge branch 'master' of github.com:apache/spark into SPARK-7977
1cd8a81 [Jonathan Alter] Removing some unnecessary comments and printlns
b837c3a [Jonathan Alter] Disallowing println

7 years ago[SPARK-8444] [STREAMING] Adding Python streaming example for queueStream
Bryan Cutler [Fri, 19 Jun 2015 07:07:53 +0000 (00:07 -0700)] 
[SPARK-8444] [STREAMING] Adding Python streaming example for queueStream

A Python example similar to the existing one for Scala.

Author: Bryan Cutler <bjcutler@us.ibm.com>

Closes #6884 from BryanCutler/streaming-queueStream-example-8444 and squashes the following commits:

435ba7e [Bryan Cutler] [SPARK-8444] Fixed style checks, increased sleep time to show empty queue
257abb0 [Bryan Cutler] [SPARK-8444] Stop context gracefully, Removed unused import, Added description comment
376ef6e [Bryan Cutler] [SPARK-8444] Fixed bug causing DStream.pprint to append empty parenthesis to output instead of blank line
1ff5f8b [Bryan Cutler] [SPARK-8444] Adding Python streaming example for queue_stream

7 years ago[SPARK-7801] [BUILD] Updating versions to SPARK 1.5.0
Patrick Wendell [Wed, 3 Jun 2015 17:11:27 +0000 (10:11 -0700)] 
[SPARK-7801] [BUILD] Updating versions to SPARK 1.5.0

Author: Patrick Wendell <patrick@databricks.com>

Closes #6328 from pwendell/spark-1.5-update and squashes the following commits:

2f42d02 [Patrick Wendell] A few more excludes
4bebcf0 [Patrick Wendell] Update to RC4
61aaf46 [Patrick Wendell] Using new release candidate
55f1610 [Patrick Wendell] Another exclude
04b4f04 [Patrick Wendell] More issues with transient 1.4 changes
36f549b [Patrick Wendell] [SPARK-7801] [BUILD] Updating versions to SPARK 1.5.0

7 years ago[SPARK-3850] Trim trailing spaces for examples/streaming/yarn.
Reynold Xin [Sun, 31 May 2015 07:47:56 +0000 (00:47 -0700)] 
[SPARK-3850] Trim trailing spaces for examples/streaming/yarn.

Author: Reynold Xin <rxin@databricks.com>

Closes #6530 from rxin/trim-whitespace-1 and squashes the following commits:

7b7b3a0 [Reynold Xin] Reset again.
dc14597 [Reynold Xin] Reset scalastyle.
cd556c4 [Reynold Xin] YARN, Kinesis, Flume.
4223fe1 [Reynold Xin] [SPARK-3850] Trim trailing spaces for examples/streaming.

7 years ago[SPARK-7558] Demarcate tests in unit-tests.log
Andrew Or [Fri, 29 May 2015 21:03:12 +0000 (14:03 -0700)] 
[SPARK-7558] Demarcate tests in unit-tests.log

Right now `unit-tests.log` is not of much value because we can't tell where the test boundaries are easily. This patch adds log statements before and after each test to outline the test boundaries, e.g.:

```
===== TEST OUTPUT FOR o.a.s.serializer.KryoSerializerSuite: 'kryo with parallelize for primitive arrays' =====

15/05/27 12:36:39.596 pool-1-thread-1-ScalaTest-running-KryoSerializerSuite INFO SparkContext: Starting job: count at KryoSerializerSuite.scala:230
15/05/27 12:36:39.596 dag-scheduler-event-loop INFO DAGScheduler: Got job 3 (count at KryoSerializerSuite.scala:230) with 4 output partitions (allowLocal=false)
15/05/27 12:36:39.596 dag-scheduler-event-loop INFO DAGScheduler: Final stage: ResultStage 3(count at KryoSerializerSuite.scala:230)
15/05/27 12:36:39.596 dag-scheduler-event-loop INFO DAGScheduler: Parents of final stage: List()
15/05/27 12:36:39.597 dag-scheduler-event-loop INFO DAGScheduler: Missing parents: List()
15/05/27 12:36:39.597 dag-scheduler-event-loop INFO DAGScheduler: Submitting ResultStage 3 (ParallelCollectionRDD[5] at parallelize at KryoSerializerSuite.scala:230), which has no missing parents

...

15/05/27 12:36:39.624 pool-1-thread-1-ScalaTest-running-KryoSerializerSuite INFO DAGScheduler: Job 3 finished: count at KryoSerializerSuite.scala:230, took 0.028563 s
15/05/27 12:36:39.625 pool-1-thread-1-ScalaTest-running-KryoSerializerSuite INFO KryoSerializerSuite:

***** FINISHED o.a.s.serializer.KryoSerializerSuite: 'kryo with parallelize for primitive arrays' *****

...
```

Author: Andrew Or <andrew@databricks.com>

Closes #6441 from andrewor14/demarcate-tests and squashes the following commits:

879b060 [Andrew Or] Fix compile after rebase
d622af7 [Andrew Or] Merge branch 'master' of github.com:apache/spark into demarcate-tests
017c8ba [Andrew Or] Merge branch 'master' of github.com:apache/spark into demarcate-tests
7790b6c [Andrew Or] Fix tests after logical merge conflict
c7460c0 [Andrew Or] Merge branch 'master' of github.com:apache/spark into demarcate-tests
c43ffc4 [Andrew Or] Fix tests?
8882581 [Andrew Or] Fix tests
ee22cda [Andrew Or] Fix log message
fa9450e [Andrew Or] Merge branch 'master' of github.com:apache/spark into demarcate-tests
12d1e1b [Andrew Or] Various whitespace changes (minor)
69cbb24 [Andrew Or] Make all test suites extend SparkFunSuite instead of FunSuite
bbce12e [Andrew Or] Fix manual things that cannot be covered through automation
da0b12f [Andrew Or] Add core tests as dependencies in all modules
f7d29ce [Andrew Or] Introduce base abstract class for all test suites

7 years ago[SPARK-7929] Turn whitespace checker on for more token types.
Reynold Xin [Fri, 29 May 2015 06:00:02 +0000 (23:00 -0700)] 
[SPARK-7929] Turn whitespace checker on for more token types.

This is the last batch of changes to complete SPARK-7929.

Previous related PRs:
https://github.com/apache/spark/pull/6480
https://github.com/apache/spark/pull/6478
https://github.com/apache/spark/pull/6477
https://github.com/apache/spark/pull/6476
https://github.com/apache/spark/pull/6475
https://github.com/apache/spark/pull/6474
https://github.com/apache/spark/pull/6473

Author: Reynold Xin <rxin@databricks.com>

Closes #6487 from rxin/whitespace-lint and squashes the following commits:

b33d43d [Reynold Xin] [SPARK-7929] Turn whitespace checker on for more token types.

7 years ago[SPARK-7929] Remove Bagel examples & whitespace fix for examples.
Reynold Xin [Fri, 29 May 2015 03:11:04 +0000 (20:11 -0700)] 
[SPARK-7929] Remove Bagel examples & whitespace fix for examples.

Author: Reynold Xin <rxin@databricks.com>

Closes #6480 from rxin/whitespace-example and squashes the following commits:

8a4a3d4 [Reynold Xin] [SPARK-7929] Remove Bagel examples & whitespace fix for examples.