
Solution: the line prop.setProperty("driver", "oracle.jdbc.driver.OracleDriver") must be added to the connection properties. Spark: Exception in thread "main" java.sql.SQLException: No suitable driver

I want to run Spark locally. I built a jar with all of its dependencies using Maven.

This is my pom.xml:

<?xml version="1.0" encoding="UTF-8"?> 
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd"> 
    <modelVersion>4.0.0</modelVersion> 

    <groupId>com.agildata</groupId> 
    <artifactId>spark-rdd-dataframe-dataset</artifactId> 
    <packaging>jar</packaging> 
    <version>1.0</version> 

    <properties>  
     <exec-maven-plugin.version>1.4.0</exec-maven-plugin.version> 
     <spark.version>1.6.0</spark.version> 
    </properties> 

    <dependencies> 

     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-core_2.11</artifactId> 
      <version>${spark.version}</version> 
     </dependency> 

     <dependency> 
      <groupId>org.apache.spark</groupId> 
      <artifactId>spark-sql_2.11</artifactId> 
      <version>${spark.version}</version> 
     </dependency> 

     <dependency> 
      <groupId>com.oracle</groupId> 
      <artifactId>ojdbc7</artifactId> 
      <version>12.1.0.2</version> 
     </dependency> 
    </dependencies> 

    <build> 
     <plugins> 

      <plugin> 
       <groupId>org.apache.maven.plugins</groupId> 
       <artifactId>maven-compiler-plugin</artifactId> 
       <version>3.2</version> 
       <configuration> 
        <source>1.8</source> 
        <target>1.8</target> 
       </configuration> 
      </plugin> 

      <plugin> 
       <groupId>net.alchim31.maven</groupId> 
       <artifactId>scala-maven-plugin</artifactId> 
       <executions> 
        <execution> 
         <id>scala-compile-first</id> 
         <phase>process-resources</phase> 
         <goals> 
          <goal>add-source</goal> 
          <goal>compile</goal> 
         </goals> 
        </execution> 
        <execution> 
         <id>scala-test-compile</id> 
         <phase>process-test-resources</phase> 
         <goals> 
          <goal>testCompile</goal> 
         </goals> 
        </execution> 
       </executions> 
      </plugin> 


      <plugin> 
       <groupId>org.apache.maven.plugins</groupId> 
       <artifactId>maven-assembly-plugin</artifactId> 
       <version>2.4.1</version> 
       <configuration> 
        <!-- get all project dependencies --> 
        <descriptorRefs> 
         <descriptorRef>jar-with-dependencies</descriptorRef> 
        </descriptorRefs> 
        <!-- MainClass in manifest makes an executable jar --> 
        <archive> 
         <manifest> 
          <mainClass>example.dataframe.ScalaDataFrameExample</mainClass> 
         </manifest> 
        </archive> 

       </configuration> 
       <executions> 
        <execution> 
         <id>make-assembly</id> 
         <!-- bind to the packaging phase --> 
         <phase>package</phase> 
         <goals> 
          <goal>single</goal> 
         </goals> 
        </execution> 
       </executions> 
      </plugin> 


     </plugins> 
    </build> 

</project> 

I run the mvn package command and the build is successful. After that I try to submit the job like this:

GMAC:bin gabor_dev$ sh spark-submit --class example.dataframe.ScalaDataFrameExample --master spark://QGMAC.local:7077 /Users/gabor_dev/IdeaProjects/dataframe/target/spark-rdd-dataframe-dataset-1.0-jar-with-dependencies.jar

but it throws this: Exception in thread "main" java.sql.SQLException: No suitable driver

The full error message:

16/07/08 13:09:22 INFO BlockManagerMaster: Registered BlockManager 
Exception in thread "main" java.sql.SQLException: No suitable driver 
    at java.sql.DriverManager.getDriver(DriverManager.java:315) 
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$2.apply(JdbcUtils.scala:50) 
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$$anonfun$2.apply(JdbcUtils.scala:50) 
    at scala.Option.getOrElse(Option.scala:120) 
    at org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils$.createConnectionFactory(JdbcUtils.scala:49) 
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:120) 
    at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:91) 
    at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:222) 
    at org.apache.spark.sql.DataFrameReader.jdbc(DataFrameReader.scala:146) 
    at example.dataframe.ScalaDataFrameExample$.main(ScalaDataFrameExample.scala:30) 
    at example.dataframe.ScalaDataFrameExample.main(ScalaDataFrameExample.scala) 
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) 
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) 
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) 
    at java.lang.reflect.Method.invoke(Method.java:497) 
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731) 
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181) 
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206) 
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121) 
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) 
16/07/08 13:09:22 INFO SparkContext: Invoking stop() from shutdown hook 

The interesting thing is that if I run it from IntelliJ IDEA's embedded console like this:

mvn package exec:java -Dexec.mainClass=example.dataframe.ScalaDataFrameExample

then it runs without any error.

This is the relevant part of the Scala code:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)

val url = "jdbc:oracle:thin:@xxx.xxx.xx:1526:SIDNAME"

val prop = new java.util.Properties
prop.setProperty("user", "usertst")
prop.setProperty("password", "usertst")

val people = sqlContext.read.jdbc(url, "table_name", prop)
people.show()

I checked my jar file and it contains all the dependencies. Can anyone help me solve this problem? Thanks!


Copy-paste? – Vale


Copied. Please check the question. – solarenqu


And from IntelliJ IDEA it works, correct? Are you running this job on a cluster or locally? Locally – Vale

Answers


So, the missing driver is the JDBC driver, and you have to add it to the Spark SQL configuration. You either do it at application submission, as described by this answer, or you do it through the Properties object you are already passing, by adding this line:

prop.setProperty("driver", "oracle.jdbc.driver.OracleDriver") 
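For context, here is a minimal sketch of the asker's snippet with the fix applied. The SparkConf setup is an assumption (the question's snippet starts after it), and the URL, table name, and credentials are the placeholders from the question:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("ScalaDataFrameExample") // assumed setup, not shown in the question
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)

val url = "jdbc:oracle:thin:@xxx.xxx.xx:1526:SIDNAME"

val prop = new java.util.Properties
prop.setProperty("user", "usertst")
prop.setProperty("password", "usertst")
// The fix: name the JDBC driver class explicitly. Without it,
// DriverManager cannot match the jdbc:oracle URL to a driver
// class, which is exactly what "No suitable driver" means.
prop.setProperty("driver", "oracle.jdbc.driver.OracleDriver")

val people = sqlContext.read.jdbc(url, "table_name", prop)
people.show()

The submission-time alternative is to hand the driver jar to spark-submit itself, for example by adding --jars /path/to/ojdbc7.jar --driver-class-path /path/to/ojdbc7.jar to the spark-submit command (the jar path here is a placeholder), so that the driver class ends up on the classpath of both the driver and the executors.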

Kudos! It solved my problem :) – bigdatamann
