Download a 40 MB zip file with PHP

2017-09-19 244 views
1

I have a 40 MB zip file at the URL http://info/data/bigfile.zip that I want to download to my local server. What is currently the best way to download a zip file of this size with PHP or header-based requests so that it doesn't time out around the 8 MB mark or give me a 500 error? Right now I keep timing out.

+1

curl ........... – rtfm
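The comment above just says "curl". As a rough sketch of what that could look like with PHP's cURL extension (the URL is the one from the question, the local path and timeout are placeholders), streaming the response straight to disk instead of buffering it in memory:

<?php 
// Sketch: download a large remote file with cURL, writing it directly to disk. 
$url = 'http://info/data/bigfile.zip'; // remote file from the question 
$dest = '/tmp/bigfile.zip';            // placeholder local path 

$fp = fopen($dest, 'wb'); 
if ($fp === false) { 
    die('Could not open destination file for writing'); 
} 

$ch = curl_init($url); 
curl_setopt($ch, CURLOPT_FILE, $fp);            // stream the body straight into $fp 
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true); // follow redirects, if any 
curl_setopt($ch, CURLOPT_TIMEOUT, 600);         // generous timeout for a 40 MB download 
curl_setopt($ch, CURLOPT_FAILONERROR, true);    // treat HTTP errors (>= 400) as failures 

if (curl_exec($ch) === false) { 
    echo 'cURL error: ' . curl_error($ch); 
} 
curl_close($ch); 
fclose($fp); 
?>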

+1

You can increase the timeout in 'php.ini'. – hamed

Answers

0

Not a lot of detail is given, but it sounds like the default settings in php.ini are limiting the server's ability to transfer large files through the PHP web interface in the time allowed.

That is, settings such as post_max_size, upload_max_filesize, or max_execution_time.

Try bumping those values up in the php.ini file, restarting Apache, and then retrying the file transfer.
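As a rough illustration (the exact values below are only examples, not something the answer prescribes):

; php.ini -- illustrative values only, adjust to your environment 
post_max_size = 64M 
upload_max_filesize = 64M 
max_execution_time = 300 

Note that max_execution_time can also be raised from inside a script with set_time_limit(), but post_max_size and upload_max_filesize can only be changed in php.ini or the web server's per-directory configuration, not with ini_set() at runtime.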

HTH

1

To download large files through PHP, try something like this (source: http://teddy.fr/2007/11/28/how-serve-big-files-through-php/):

<?php 
define('CHUNK_SIZE', 1024*1024); // Size (in bytes) of each chunk 

// Read a file and display its content chunk by chunk 
function readfile_chunked($filename, $retbytes = TRUE) { 
    $buffer = ''; 
    $cnt = 0; 
    $handle = fopen($filename, 'rb'); 

    if ($handle === false) { 
     return false; 
    } 

    while (!feof($handle)) { 
     $buffer = fread($handle, CHUNK_SIZE); 
     echo $buffer; 
     ob_flush(); // flush PHP's output buffer... 
     flush();    // ...and the web server's, so each chunk is sent to the client right away 

     if ($retbytes) { 
      $cnt += strlen($buffer); 
     } 
    } 

    $status = fclose($handle); 

    if ($retbytes && $status) { 
     return $cnt; // return num. bytes delivered like readfile() does. 
    } 

    return $status; 
} 

// Here goes your code for checking that the user is logged in 
// ... 
// ... 

    $filename = 'path/to/your/file'; // path of the file to serve 
    $mimetype = 'mime/type';         // e.g. application/zip 
    header('Content-Type: '.$mimetype); 
    readfile_chunked($filename); 

?> 

**Solution two**

Copy the remote file over HTTP one small chunk at a time:

/** 
* Copy remote file over HTTP one small chunk at a time. 
* 
* @param $infile The full URL to the remote file 
* @param $outfile The path where to save the file 
*/ 
function copyfile_chunked($infile, $outfile) { 
    $chunksize = 10 * (1024 * 1024); // 10 Megs 

    /** 
    * parse_url breaks a URL apart into its parts, i.e. host, path, 
    * query string, etc. 
    */ 
    $parts = parse_url($infile); 
    $i_handle = fsockopen($parts['host'], 80, $errno, $errstr, 5); 
    $o_handle = fopen($outfile, 'wb'); 

    if ($i_handle === false || $o_handle === false) { 
     return false; 
    } 

    if (!empty($parts['query'])) { 
     $parts['path'] .= '?' . $parts['query']; 
    } 

    /** 
    * Send the request to the server for the file 
    */ 
    $request = "GET {$parts['path']} HTTP/1.1\r\n"; 
    $request .= "Host: {$parts['host']}\r\n"; 
    $request .= "User-Agent: Mozilla/5.0\r\n"; 
    $request .= "Keep-Alive: 115\r\n"; 
    $request .= "Connection: keep-alive\r\n\r\n"; 
    fwrite($i_handle, $request); 

    /** 
    * Now read the headers from the remote server. We'll need 
    * to get the content length. 
    */ 
    $headers = array(); 
    while(!feof($i_handle)) { 
     $line = fgets($i_handle); 
     if ($line == "\r\n") break; 
     $headers[] = $line; 
    } 

    /** 
    * Look for the Content-Length header, and get the size 
    * of the remote file. 
    */ 
    $length = 0; 
    foreach($headers as $header) { 
     if (stripos($header, 'Content-Length:') === 0) { 
      $length = (int) trim(substr($header, strlen('Content-Length:'))); 
      break; 
     } 
    } 

    /** 
    * Start reading in the remote file, and writing it to the 
    * local file one chunk at a time. 
    */ 
    $cnt = 0; 
    while(!feof($i_handle)) { 
     $buf = ''; 
     $buf = fread($i_handle, $chunksize); 
     $bytes = fwrite($o_handle, $buf); 
     if ($bytes === false) { 
      return false; 
     } 
     $cnt += $bytes; 

     /** 
     * We're done reading when we've reached the content length 
     */ 
     if ($cnt >= $length) break; 
    } 

    fclose($i_handle); 
    fclose($o_handle); 
    return $cnt; 
} 

Adjust the $chunksize variable to suit your needs. This has only been lightly tested, and it could easily break for a number of reasons.

Usage:

copyfile_chunked('http://somesite.com/somefile.jpg', '/local/path/somefile.jpg');
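As a simpler variant of the same idea (not part of the original answer): if allow_url_fopen is enabled, PHP's built-in HTTP stream wrapper can handle the request and headers for you, and stream_copy_to_stream() copies the body in chunks internally without loading the whole file into memory. A minimal sketch, with placeholder paths:

<?php 
// Sketch: copy a remote file to disk via PHP's HTTP stream wrapper. 
// Assumes allow_url_fopen = On; URL and destination path are placeholders. 
$in  = fopen('http://somesite.com/somefile.jpg', 'rb'); 
$out = fopen('/local/path/somefile.jpg', 'wb'); 

if ($in === false || $out === false) { 
    die('Could not open source or destination'); 
} 

$bytes = stream_copy_to_stream($in, $out); // copies in chunks internally 
fclose($in); 
fclose($out); 

echo "Copied $bytes bytes\n"; 
?>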