Details

    • Type: Bug
    • Status: Resolved
    • Priority: Blocker
    • Resolution: Fixed
    • Affects Version/s: 0.9.0
    • Fix Version/s: 1.0.0, 0.9.1
    • Component/s: Spark Core
    • Labels: None

Description

Using the ADD_JARS environment variable with spark-shell used to add the jar to both the shell and the various workers. Now it only adds the jar to the workers, so importing a custom class in the shell is broken.

The workaround is to add custom jars to both ADD_JARS and SPARK_CLASSPATH.

We should fix ADD_JARS so it works properly again.
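The workaround above can be expressed as a single invocation; the jar path below is a placeholder for illustration:

```shell
# Workaround on 0.9.0: list the custom jar in both variables so it is
# shipped to the workers (ADD_JARS) and also placed on the driver/shell
# classpath (SPARK_CLASSPATH). /path/to/my-custom.jar is a placeholder.
ADD_JARS=/path/to/my-custom.jar \
SPARK_CLASSPATH=/path/to/my-custom.jar \
./bin/spark-shell
```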

See various threads on the user list:
https://mail-archives.apache.org/mod_mbox/incubator-spark-user/201402.mbox/%3CCAJbo4neMLiTrnm1XbyqomWmp0m+EUcg4yE-txuRGSVKOb5KLeA@mail.gmail.com%3E
(another thread that doesn't appear in the archives yet, titled "ADD_JARS not working on 0.9")

People

    • Assignee: Nan Zhu (CodingCat)
    • Reporter: Andrew Ash (ash211)
    • Votes: 1
    • Watchers: 4
