django-storages not detecting changed static files

2013-07-06 145 views

7

I'm using django-storages with Amazon S3 for my static files. Following the documentation, I put these settings in my settings.py:

STATIC_URL = 'https://mybucket.s3.amazonaws.com/' 

ADMIN_MEDIA_PREFIX = 'https://mybucket.s3.amazonaws.com/admin/' 

INSTALLED_APPS += (
    'storages', 
) 

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage' 
AWS_ACCESS_KEY_ID = 'mybucket_key_id' 
AWS_SECRET_ACCESS_KEY = 'mybucket_access_key' 
AWS_STORAGE_BUCKET_NAME = 'mybucket' 
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage' 

The first time I ran collectstatic, everything worked fine and my static files were uploaded to my S3 bucket.

However, after modifying my static files and running python manage.py collectstatic again, I get this, even though the static files were changed:

-----> Collecting static files 
    0 static files copied, 81 unmodified. 

However, if I rename a changed static file, it is then correctly copied to my S3 bucket.

Why isn't django-storages uploading my changed static files? Is this a configuration problem, or is the issue deeper?

Answers

12

collectstatic skips a file when the "destination" file is "younger" than the source file. It looks like Amazon S3 is returning the wrong dates for your files.

You could investigate the [code][1] and debug the server response. Perhaps there is a timezone problem.

Or you can just pass the --clear argument to collectstatic so that all files are deleted from S3 before being collected again.
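The skip check this answer describes can be sketched roughly as follows (a simplified illustration, not Django's actual source; the timestamps are made up):

```python
from datetime import datetime

# Assumed simplification of collectstatic's decision: the copy is skipped
# when the destination looks at least as new as the source.
def should_skip(source_mtime, target_mtime):
    return target_mtime is not None and target_mtime >= source_mtime

# If S3 reports timestamps in a different timezone and they are compared as
# naive wall-clock times, an older upload can still look "newer":
source = datetime(2013, 7, 6, 12, 0)  # local EST wall-clock, just modified
target = datetime(2013, 7, 6, 16, 0)  # GMT wall-clock of an older upload
assert should_skip(source, target)    # wrongly skipped
```

This is why a wrong date from the storage backend silently turns into "0 static files copied".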

+0

Thanks, this works. I'll wait a bit to see if someone posts an exact solution to my problem, but if not, the +50 is yours :) – bab

+0

--clear doesn't seem to work with S3 for me. If I delete the files in S3 manually, they all get copied up again. – mgojohn

5

https://github.com/antonagestam/collectfast

From the project's readme.txt: a custom management command that compares the MD5 sum with the ETag from S3 and skips the file copy if the two are the same. This makes running collectstatic much faster if you use git as your source control system, since git updates timestamps.
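The comparison the readme describes can be sketched like this (an assumed illustration, not Collectfast's actual code; for simple, non-multipart uploads, S3's ETag is the hex MD5 digest of the object body):

```python
import hashlib

def file_md5(path):
    # Hash the file in chunks so large assets don't load fully into memory.
    h = hashlib.md5()
    with open(path, 'rb') as f:
        for chunk in iter(lambda: f.read(8192), b''):
            h.update(chunk)
    return h.hexdigest()

def is_unchanged(path, s3_etag):
    # ETags come back quoted from S3, e.g. '"5eb63bbb..."'.
    return s3_etag.strip('"') == file_md5(path)
```

Because the content hash is compared instead of timestamps, the timezone problem above never enters the picture.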

0

This question is a bit old, but I thought I'd share my experience in case it helps someone in the future. Following suggestions found in other threads, I confirmed that for me this was indeed caused by a timezone difference. My Django time wasn't wrong, but it was set to EST while S3 was set to GMT. In testing, I rolled back to django-storages 1.1.5, and collectstatic did seem to work. Partly out of personal preference, though, I was unwilling to a) roll back three versions of django-storages and lose any potential bug fixes, or b) change my project's timezone for what essentially comes down to a convenience function (albeit an important one).
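The EST-vs-GMT mismatch described above can be demonstrated in isolation (the timestamps are made up):

```python
from datetime import datetime, timezone, timedelta

EST = timezone(timedelta(hours=-5))

# The same instant, expressed in two zones:
local = datetime(2013, 7, 6, 12, 0, tzinfo=EST)            # file saved locally
on_s3 = datetime(2013, 7, 6, 17, 0, tzinfo=timezone.utc)   # as reported by S3

# Compared as timezone-aware datetimes, they are equal:
assert local == on_s3

# Stripped to naive wall-clock times, the S3 copy looks 5 hours newer,
# which is enough for collectstatic's comparison to skip the upload:
assert on_s3.replace(tzinfo=None) > local.replace(tzinfo=None)
```

Normalizing both sides to UTC (or hashing content instead, as Collectfast does) removes the false "newer" signal.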

I wrote a short script that does the same job as collectstatic without the changes above. It will need some modification per application, but it should work for the standard case if you place it at the application level and replace 'static_dirs' with the names of your project's apps. It runs from the terminal with 'python whatever_you_call_it.py -e environment_name' (set this to your AWS bucket), with this line in the settings:

TIME_ZONE = 'UTC' 

The script, run in place of collectstatic:

import os 
import time 
import argparse 
from datetime import datetime, timedelta 

from boto3.session import Session 
import pytz 

utc = pytz.UTC 
DEV_BUCKET_NAME = 'dev-homfield-media-root' 
PROD_BUCKET_NAME = 'homfield-media-root' 
static_dirs = ['accounts', 'messaging', 'payments', 'search', 'sitewide'] 

def main(): 
    try: 
     parser = argparse.ArgumentParser(description='Homfield Collectstatic. Our version of collectstatic to fix django-storages bug.\n') 
     parser.add_argument('-e', '--environment', type=str, required=True, help='Name of environment (dev/prod)') 
     args = parser.parse_args() 
     vargs = vars(args) 
     if vargs['environment'] == 'dev': 
      selected_bucket = DEV_BUCKET_NAME 
      print "\nAre you sure? You're about to push to the DEV bucket. (Y/n)" 
     elif vargs['environment'] == 'prod': 
      selected_bucket = PROD_BUCKET_NAME 
      print "Are you sure? You're about to push to the PROD bucket. (Y/n)" 
     else: 
      raise ValueError 

     acceptable = ['Y', 'y', 'N', 'n'] 
     confirmation = raw_input().strip() 
     while confirmation not in acceptable: 
      print "That's an invalid response. (Y/n)" 
      confirmation = raw_input().strip() 

     if confirmation == 'Y' or confirmation == 'y': 
      run(selected_bucket) 
     else: 
      print "Collectstatic aborted." 
    except Exception as e: 
     print type(e) 
     print "An error occurred. S3 staticfiles may not have been updated." 


def run(bucket_name): 

    #open session with S3 
    session = Session(aws_access_key_id='{aws_access_key_id}', 
     aws_secret_access_key='{aws_secret_access_key}', 
     region_name='us-east-1') 
    s3 = session.resource('s3') 
    bucket = s3.Bucket(bucket_name) 

    # loop through static directories 
    for directory in static_dirs: 
     rootDir = './' + directory + "/static" 
     print('Checking directory: %s' % rootDir) 

     #loop through subdirectories 
     for dirName, subdirList, fileList in os.walk(rootDir): 
      #loop through all files in subdirectory 
      for fname in fileList: 
       try: 
        if fname == '.DS_Store': 
         continue 

        # find and qualify file last modified time 
        full_path = dirName + "/" + fname 
        last_mod_string = time.ctime(os.path.getmtime(full_path)) 
        file_last_mod = datetime.strptime(last_mod_string, "%a %b %d %H:%M:%S %Y") + timedelta(hours=5) 
        file_last_mod = utc.localize(file_last_mod) 

        # truncate path for S3 loop and find object, delete and update if it has been updates 
        s3_path = full_path[full_path.find('static'):] 
        found = False 
        for key in bucket.objects.all(): 
         if key.key == s3_path: 
          found = True 
          last_mod_date = key.last_modified 
          if last_mod_date < file_last_mod: 
           key.delete() 
           s3.Object(bucket_name, s3_path).put(Body=open(full_path, 'rb'), ContentType=get_mime_type(full_path)) 
           print "\tUpdated : " + full_path 
        if not found: 
         # if file not found in S3 it is new, send it up 
         print "\tFound a new file. Uploading : " + full_path 
          s3.Object(bucket_name, s3_path).put(Body=open(full_path, 'rb'), ContentType=get_mime_type(full_path)) 
        except Exception: 
         print "ALERT: Problem with: " + full_path + ". Skipping this file." 


def get_mime_type(full_path): 
    try: 
     last_index = full_path.rfind('.') 
     if last_index < 0: 
      return 'application/octet-stream' 
     extension = full_path[last_index:] 
     return { 
      '.js' : 'application/javascript', 
      '.css' : 'text/css', 
      '.txt' : 'text/plain', 
      '.png' : 'image/png', 
      '.jpg' : 'image/jpeg', 
      '.jpeg' : 'image/jpeg', 
      '.eot' : 'application/vnd.ms-fontobject', 
      '.svg' : 'image/svg+xml', 
      '.ttf' : 'application/octet-stream', 
      '.woff' : 'application/x-font-woff', 
      '.woff2' : 'application/octet-stream' 
     }[extension] 
    except KeyError: 
     print 'ALERT: Couldn\'t match mime type for ' + full_path + '. Sending to S3 as application/octet-stream.' 
     return 'application/octet-stream' 

if __name__ == '__main__': 
    main() 
2

Create a separate settings file used only for the collectstatic sync, with this configuration, and run collectstatic against it:

python manage.py collectstatic --settings=settings.collectstatic 
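A minimal sketch of such a settings module (the module and path names here are assumptions about your project layout):

```python
# settings/collectstatic.py -- hypothetical settings module used only for
# the sync; reuse the regular settings and override only the timezone.
from settings.base import *  # noqa

# Match S3's clock so collectstatic's modified-time comparison is sane:
TIME_ZONE = 'UTC'
```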
+1

This solved my problem. – kmomo