
Filter a dataframe and store the results in new dataframes using Python

I have a dataframe with about 142264 rows.

Sample:

   DateAndTime TAGID TagValue UIB 
    2017-04-26 00:00:00.000 100  0.9 NaN 
    2017-04-26 00:00:00.000 101  430.3 NaN 
    2017-04-26 00:00:00.000 102  112.7 NaN 
    2017-04-26 00:00:00.000 103  50.0 NaN 
    2017-04-26 00:00:00.000 104  249.4 NaN 
    2017-04-26 00:00:00.000 105  109.9 NaN 
    2017-04-26 00:00:00.000 106  248.4 NaN 
    2017-04-26 00:00:00.000 107  131.5 NaN 
    2017-04-26 00:00:00.000 108  247.7 NaN 
    2017-04-26 00:00:00.000 109  96.8 NaN 
    2017-04-26 00:00:00.000 113  481.4 NaN 
    2017-04-26 00:00:00.000 114  243.9 NaN 
    2017-04-26 00:00:00.000 115 -416.0 NaN 
    2017-04-26 00:00:00.000 116  -0.5 NaN 
    2017-04-26 00:00:00.000 117  429.2 NaN 
    2017-04-26 00:00:00.000 118  646.4 NaN 
    2017-04-26 00:00:00.000 119  49.5 NaN 
    2017-04-26 00:00:00.000 120  248.2 NaN 
    2017-04-26 00:01:00.000 100  0.9 NaN 
    2017-04-26 00:01:00.000 101  429.7 NaN 
    2017-04-26 00:01:00.000 102  120.0 NaN 
    2017-04-26 00:01:00.000 103  49.9 NaN 
    2017-04-26 00:01:00.000 104  249.2 NaN 
    2017-04-26 00:01:00.000 105  123.8 NaN 
    2017-04-26 00:01:00.000 106  248.3 NaN 
    2017-04-26 00:01:00.000 107  136.3 NaN 
    2017-04-26 00:01:00.000 108  247.4 NaN 
    2017-04-26 00:01:00.000 109  99.9 NaN 
    2017-04-26 00:01:00.000 113  481.4 NaN 
    2017-04-26 00:01:00.000 114  243.9 NaN 

I want to filter the dataframe by each unique TAGID and store each subset separately in a new dataframe.

I tried:

import pandas as pd

data = pd.read_json("json_tagid_100_120.json")
tagid101 = data[data["TAGID"] == 101]
print(tagid101)

By doing this I am only able to store the data for TAGID 101, but I want the data for each individual TAGID stored in its own new dataframe.

Answer

I think the best approach is to convert the DataFrameGroupBy object to a tuple and then to a dict, so that all the DataFrames end up stored in a dict:

dfs = dict(tuple(data.groupby("TAGID"))) 

print (dfs[101]) 
      DateAndTime TAGID TagValue UIB 
1 2017-04-26 00:00:00 101  430.3 NaN 
19 2017-04-26 00:01:00 101  429.7 NaN 
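
For example, you could then loop over the dict to handle each TAGID's frame on its own; the CSV file name below is just an illustration, not part of the original answer:

# dfs maps each TAGID to its own DataFrame
for tagid, df in dfs.items():
    print(tagid, df.shape)
    df.to_csv("tagid_{}.csv".format(tagid), index=False)  # illustrative output path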

Can I store the new dataframes using a for loop? Like: tags = data['TAGID'].unique() and then use a for loop. – Dheeraj
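
A minimal sketch of that for-loop idea, collecting the per-TAGID frames in a dict rather than separate variables (tags and frames below are illustrative names, not from the thread):

# build one DataFrame per TAGID with a boolean mask inside a loop
tags = data["TAGID"].unique()
frames = {}
for tag in tags:
    frames[tag] = data[data["TAGID"] == tag]

print(frames[101].head())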


Do you mean 'df101, df102', ...? If so, it is possible in Python, but it is complicated. Check [this](https://stackoverflow.com/q/1373164/2901002) – jezrael
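
For reference, a rough sketch of what that linked approach looks like: the loop below creates variables df100, df101, ... dynamically via globals(), which works but is usually harder to maintain than the dict above:

# dynamically create df100, df101, ... as module-level variables (generally discouraged)
for tagid, df in data.groupby("TAGID"):
    globals()["df{}".format(tagid)] = df

print(df101.head())  # df101 now refers to the TAGID 101 rows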


Thank you very much. This really helped. – Dheeraj