
Spark Java: Unable to process parts as no multi-part configuration has been provided

I want to upload a file from an HTML form using Spark. Here is the Java handler for my POST route:

Spark.post("/upload", "multipart/form-data", (request, response) -> {

    String location = "temporary";      // the directory where uploaded files will be stored
    long maxFileSize = 100000000;       // the maximum size allowed for uploaded files
    long maxRequestSize = 100000000;    // the maximum size allowed for multipart/form-data requests
    int fileSizeThreshold = 1024;       // the size threshold after which files will be written to disk

    MultipartConfigElement multipartConfigElement = new MultipartConfigElement(
            location, maxFileSize, maxRequestSize, fileSizeThreshold);
    request.raw().setAttribute("org.eclipse.jetty.multipartConfig",
            multipartConfigElement);

    Collection<Part> parts = request.raw().getParts(); // Application.java:50, where the exception below is thrown
    for (Part part : parts) {
        System.out.println("Name: " + part.getName());
        System.out.println("Size: " + part.getSize());
        System.out.println("Filename: " + part.getSubmittedFileName());
    }

    String fName = request.raw().getPart("xmlFile").getSubmittedFileName();
    System.out.println("Title: " + request.raw().getParameter("title"));
    System.out.println("File: " + fName);

    Part uploadedFile = request.raw().getPart("xmlFile");
    Path out = Paths.get("temporary/" + fName);
    try (final InputStream in = uploadedFile.getInputStream()) {
        Files.copy(in, out);
        uploadedFile.delete();
    }

    // cleanup
    multipartConfigElement = null;
    //parts = null;
    uploadedFile = null;

    return "OK";
});

Here is the HTML form:

<form class="ui fluid action input" id="fileForm" method="post" action="/sparkapp/upload" enctype = "multipart/form-data"> 
    <input type="text" name="filePath" readonly> 
    <input type="file" name="xmlFile"> 
    <button type="submit" value="Submit"> 
</form> 

When I upload a file, I get a 500 Internal Server Error with the following stack trace:

java.lang.IllegalStateException: Unable to process parts as no multi-part configuration has been provided 
    at org.apache.catalina.connector.Request.parseParts(Request.java:2734) 
    at org.apache.catalina.connector.Request.getParts(Request.java:2701) 
    at org.apache.catalina.connector.Request.getPart(Request.java:2885) 
    at org.apache.catalina.connector.RequestFacade.getPart(RequestFacade.java:1089) 
    at javax.servlet.http.HttpServletRequestWrapper.getPart(HttpServletRequestWrapper.java:362) 
    at com.amulya.Application$2.handle(Application.java:50) 
    at spark.RouteImpl$1.handle(RouteImpl.java:61) 
    at spark.http.matching.Routes.execute(Routes.java:61) 
    at spark.http.matching.MatcherFilter.doFilter(MatcherFilter.java:127) 
    at spark.servlet.SparkFilter.doFilter(SparkFilter.java:173) 
    at org.apache.catalina.core.ApplicationFilterChain.internalDoFilter(ApplicationFilterChain.java:240) 
    at org.apache.catalina.core.ApplicationFilterChain.doFilter(ApplicationFilterChain.java:207) 
    at org.apache.catalina.core.StandardWrapperValve.invoke(StandardWrapperValve.java:212) 
    at org.apache.catalina.core.StandardContextValve.invoke(StandardContextValve.java:106) 
    at org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:502) 
    at org.apache.catalina.core.StandardHostValve.invoke(StandardHostValve.java:141) 
    at org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:79) 
    at org.apache.catalina.valves.AbstractAccessLogValve.invoke(AbstractAccessLogValve.java:616) 
    at org.apache.catalina.core.StandardEngineValve.invoke(StandardEngineValve.java:88) 
    at org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:528) 
    at org.apache.coyote.http11.AbstractHttp11Processor.process(AbstractHttp11Processor.java:1100) 
    at org.apache.coyote.AbstractProtocol$AbstractConnectionHandler.process(AbstractProtocol.java:687) 
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.doRun(NioEndpoint.java:1520) 
    at org.apache.tomcat.util.net.NioEndpoint$SocketProcessor.run(NioEndpoint.java:1476) 
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) 
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) 
    at org.apache.tomcat.util.threads.TaskThread$WrappingRunnable.run(TaskThread.java:61) 
    at java.lang.Thread.run(Thread.java:745) 

I followed the question below, but its answer did not work: SparkJava: Upload file did't work in Spark java framework

I am using the Eclipse IDE and a Tomcat server.

Please help me resolve this issue.

Answers


I just found out that, since I am running Spark on a Tomcat server, I had set it up through the spark.servlet.SparkFilter filter (registered in web.xml roughly as sketched below).
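For reference, a minimal web.xml sketch of that setup; com.example.SparkApp is a hypothetical class implementing spark.servlet.SparkApplication:

<!-- sketch: running Spark behind Tomcat through SparkFilter -->
<filter>
    <filter-name>SparkFilter</filter-name>
    <filter-class>spark.servlet.SparkFilter</filter-class>
    <init-param>
        <!-- applicationClass names your spark.servlet.SparkApplication implementation;
             com.example.SparkApp is a placeholder -->
        <param-name>applicationClass</param-name>
        <param-value>com.example.SparkApp</param-value>
    </init-param>
</filter>
<filter-mapping>
    <filter-name>SparkFilter</filter-name>
    <url-pattern>/*</url-pattern>
</filter-mapping>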

Through this answer I found that, in fact, I need to set the following on the webapp's <Context> element:

allowCasualMultipartParsing="true"

This can be placed either in the webapp's META-INF/context.xml or in Tomcat/conf/server.xml. It makes Tomcat automatically parse multipart/form-data request bodies whenever HttpServletRequest.getPart* or HttpServletRequest.getParameter* is called, even when the target servlet is not marked with the @MultipartConfig annotation.
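For clarity, a minimal webapp/META-INF/context.xml along these lines (shown without any other attributes) would be:

<?xml version="1.0" encoding="UTF-8"?>
<!-- let Tomcat parse multipart/form-data bodies even though the target
     servlet/filter carries no @MultipartConfig annotation -->
<Context allowCasualMultipartParsing="true">
</Context>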

See the following links for reference:

http://sparkjava.com/documentation.html#other-webserver

https://stackoverflow.com/a/8050589/2256258

http://tomcat.apache.org/tomcat-7.0-doc/config/context.html

https://examples.javacodegeeks.com/enterprise-java/tomcat/tomcat-context-xml-configuration-example/


You can also provide the multipart configuration inside your servlet configuration in the web.xml file. I have given an example below:

<servlet> 
     <servlet-name>MyServlet</servlet-name> 
     <servlet-class>com.example.servlet.MyServlet</servlet-class> 
     <multipart-config> 
       <max-file-size>xxxxx</max-file-size> 
       <max-request-size>yyyyy</max-request-size> 
     </multipart-config> 
</servlet> 
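For concreteness, here is a filled-in sketch using the same limits as the route in the question; the sizes are in bytes, and the <location> and <file-size-threshold> values shown here are hypothetical:

<servlet>
     <servlet-name>MyServlet</servlet-name>
     <servlet-class>com.example.servlet.MyServlet</servlet-class>
     <multipart-config>
       <!-- directory where uploaded parts are written; hypothetical path -->
       <location>/tmp/uploads</location>
       <!-- limits in bytes, matching the 100000000 used in the route handler -->
       <max-file-size>100000000</max-file-size>
       <max-request-size>100000000</max-request-size>
       <!-- parts larger than this threshold are spooled to disk -->
       <file-size-threshold>1024</file-size-threshold>
     </multipart-config>
</servlet>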