2015-06-15 16 views
0

I don't understand the meaning of the `super` keyword when it is used in a class that does not seem to have a meaningful parent class (its parent is `object`).

My question comes from this class, which I found in a pull request on GitHub (link: https://github.com/statsmodels/statsmodels/pull/2374/files).

See for example the `fit` method, where the line `res = super(PenalizedMixin, self).fit(method=method, **kwds)` appears.

""" 
+Created on Sun May 10 08:23:48 2015 
+ 
+Author: Josef Perktold 
+License: BSD-3 
+""" 
+ 
+import numpy as np 
+from ._penalties import SCADSmoothed 
+ 
+class PenalizedMixin(object): 
+ """Mixin class for Maximum Penalized Likelihood 
+ 
+ 
+ TODO: missing **kwds or explicit keywords 
+ 
+ TODO: do we really need `pen_weight` keyword in likelihood methods? 
+ 
+ """ 
+ 
+ def __init__(self, *args, **kwds): 
+  super(PenalizedMixin, self).__init__(*args, **kwds) 
+ 
+  penal = kwds.pop('penal', None) 
+  # I keep the following instead of adding default in pop for future changes 
+  if penal is None: 
+   # TODO: switch to unpenalized by default 
+   self.penal = SCADSmoothed(0.1, c0=0.0001) 
+  else: 
+   self.penal = penal 
+ 
+  # TODO: define pen_weight as average pen_weight? i.e. per observation 
+  # I would have prefered len(self.endog) * kwds.get('pen_weight', 1) 
+  # or use pen_weight_factor in signature 
+  self.pen_weight = kwds.get('pen_weight', len(self.endog)) 
+ 
+  self._init_keys.extend(['penal', 'pen_weight']) 
+ 
+ 
+ 
+ def loglike(self, params, pen_weight=None): 
+  if pen_weight is None: 
+   pen_weight = self.pen_weight 
+ 
+  llf = super(PenalizedMixin, self).loglike(params) 
+  if pen_weight != 0: 
+   llf -= pen_weight * self.penal.func(params) 
+ 
+  return llf 
+ 
+ 
+ def loglikeobs(self, params, pen_weight=None): 
+  if pen_weight is None: 
+   pen_weight = self.pen_weight 
+ 
+  llf = super(PenalizedMixin, self).loglikeobs(params) 
+  nobs_llf = float(llf.shape[0]) 
+ 
+  if pen_weight != 0: 
+   llf -= pen_weight/nobs_llf * self.penal.func(params) 
+ 
+  return llf 
+ 
+ 
+ def score(self, params, pen_weight=None): 
+  if pen_weight is None: 
+   pen_weight = self.pen_weight 
+ 
+  sc = super(PenalizedMixin, self).score(params) 
+  if pen_weight != 0: 
+   sc -= pen_weight * self.penal.grad(params) 
+ 
+  return sc 
+ 
+ 
+ def scoreobs(self, params, pen_weight=None): 
+  if pen_weight is None: 
+   pen_weight = self.pen_weight 
+ 
+  sc = super(PenalizedMixin, self).scoreobs(params) 
+  nobs_sc = float(sc.shape[0]) 
+  if pen_weight != 0: 
+   sc -= pen_weight/nobs_sc * self.penal.grad(params) 
+ 
+  return sc 
+ 
+ 
+ def hessian_(self, params, pen_weight=None): 
+  if pen_weight is None: 
+   pen_weight = self.pen_weight 
+   loglike = self.loglike 
+  else: 
+   loglike = lambda p: self.loglike(p, pen_weight=pen_weight) 
+ 
+  from statsmodels.tools.numdiff import approx_hess 
+  return approx_hess(params, loglike) 
+ 
+ 
+ def hessian(self, params, pen_weight=None): 
+  if pen_weight is None: 
+   pen_weight = self.pen_weight 
+ 
+  hess = super(PenalizedMixin, self).hessian(params) 
+  if pen_weight != 0: 
+   h = self.penal.deriv2(params) 
+   if h.ndim == 1: 
+    hess -= np.diag(pen_weight * h) 
+   else: 
+    hess -= pen_weight * h 
+ 
+  return hess 
+ 
+ 
+ def fit(self, method=None, trim=None, **kwds): 
+  # If method is None, then we choose a default method ourselves 
+ 
+  # TODO: temporary hack, need extra fit kwds 
+  # we need to rule out fit methods in a model that will not work with 
+  # penalization 
+  if hasattr(self, 'family'): # assume this identifies GLM 
+   kwds.update({'max_start_irls' : 0}) 
+ 
+  # currently we use `bfgs` by default 
+  if method is None: 
+   method = 'bfgs' 
+ 
+  if trim is None: 
+   trim = False # see below infinite recursion in `fit_constrained 
+ 
+  res = super(PenalizedMixin, self).fit(method=method, **kwds) 
+ 
+  if trim is False: 
+   # note boolean check for "is False" not evaluates to False 
+   return res 
+  else: 
+   # TODO: make it penal function dependent 
+   # temporary standin, only works for Poisson and GLM, 
+   # and is computationally inefficient 
+   drop_index = np.nonzero(np.abs(res.params) < 1e-4) [0] 
+   keep_index = np.nonzero(np.abs(res.params) > 1e-4) [0] 
+   rmat = np.eye(len(res.params))[drop_index] 
+ 
+   # calling fit_constrained raise 
+   # "RuntimeError: maximum recursion depth exceeded in __instancecheck__" 
+   # fit_constrained is calling fit, recursive endless loop 
+   if drop_index.any(): 
+    # todo : trim kwyword doesn't work, why not? 
+    #res_aux = self.fit_constrained(rmat, trim=False) 
+    res_aux = self._fit_zeros(keep_index, **kwds) 
+    return res_aux 
+   else: 
+    return res 
+ 
+ 

I tried to reproduce the code with a simpler example, but it doesn't work:

class A(object): 
    def __init__(self): 
        return 

    def funz(self, x): 
        print(x) 

    def funz2(self, x): 
        llf = super(A, self).funz2(x) 
        print(x + 1) 

a = A() 
a.funz(3) 
a.funz2(4) 


Traceback (most recent call last): 
  File "<stdin>", line 1, in <module> 
  File "/home/donbeo/Desktop/prova.py", line 15, in <module> 
    a.funz2(4) 
  File "/home/donbeo/Desktop/prova.py", line 10, in funz2 
    llf = super(A, self).funz2(x) 
AttributeError: 'super' object has no attribute 'funz2' 
+0

[Calling super().__init__() in classes derived from 'object'](http://stackoverflow.com/questions/6796996/call-super-init-in-classes-derived-from-object) – vaultah

+0

Related: http://stackoverflow.com/q/30041679/3001761 – jonrsharpe

Answers

2

You should always use `super` if there is any chance the class will be used in a multiple-inheritance situation, otherwise classes may get missed out, particularly when mix-in classes are in use (where multiple inheritance is inevitable). For example:

class BaseClass(object): 

    def __init__(self): 
        print('BaseClass.__init__') 


class MixInClass(object): 

    def __init__(self): 
        print('MixInClass.__init__') 


class ChildClass(BaseClass, MixInClass): 

    def __init__(self): 
        print('ChildClass.__init__') 
        super(ChildClass, self).__init__()  # -> BaseClass.__init__ 


if __name__ == '__main__': 
    child = ChildClass() 
gives:

ChildClass.__init__ 
BaseClass.__init__ 

missing out `MixInClass.__init__`, whereas:

class BaseClass(object): 

    def __init__(self): 
        print('BaseClass.__init__') 
        super(BaseClass, self).__init__()  # -> MixInClass.__init__ 


class MixInClass(object): 

    def __init__(self): 
        print('MixInClass.__init__') 
        super(MixInClass, self).__init__()  # -> object.__init__ 


class ChildClass(BaseClass, MixInClass): 

    def __init__(self): 
        print('ChildClass.__init__') 
        super(ChildClass, self).__init__()  # -> BaseClass.__init__ 


if __name__ == '__main__': 
    child = ChildClass() 

gives:

ChildClass.__init__ 
BaseClass.__init__ 
MixInClass.__init__ 

`ChildClass.__mro__`, the "method resolution order", is the same in both cases:

(<class '__main__.ChildClass'>, <class '__main__.BaseClass'>, <class '__main__.MixInClass'>, <type 'object'>) 

Both `BaseClass` and `MixInClass` inherit only from `object` (i.e. they are "new-style" classes), but you still need to use `super` to ensure that any other implementations of the method in classes in the MRO also get called. To enable this, `object.__init__` exists, but doesn't actually do very much!
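That last point can be seen directly. The following is a minimal sketch of my own (not from the answer): `object` supplies a do-nothing `__init__` that safely terminates the cooperative chain.

```python
class Base(object):
    def __init__(self):
        print('Base.__init__')
        # The next class in Base's MRO is object; object.__init__ exists
        # precisely so that this call has somewhere safe to land.
        super(Base, self).__init__()

b = Base()  # prints: Base.__init__
```

So even a class with no "real" parent can call `super` without error, as long as it passes no extra arguments along.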

+0

So in this way `ChildClass.__init__()` calls all of the parents' `__init__` methods? Even with 20 parent classes? – Donbeo

+0

@Donbeo not exactly - `ChildClass.__init__` calls `super`, which gets the *next* implementation *up* the MRO. It is then up to that next implementation (`BaseClass.__init__` in this case) to decide whether it should be the last call, or whether it should call `super` in turn to get the implementation after that. If there are 20 parent classes and *all of their `__init__`s call `super`*, then all 20 methods will be called (but each only once). – jonrsharpe

+0

So it stops when it reaches a class whose `__init__` method doesn't use `super`? – Donbeo
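Yes. A small sketch of my own (not from the thread) illustrating this: once a method in the chain omits the `super` call, classes later in the MRO are never reached.

```python
class Stopper(object):
    def __init__(self):
        print('Stopper.__init__')  # does NOT call super: the chain ends here

class Cooperative(object):
    def __init__(self):
        print('Cooperative.__init__')
        super(Cooperative, self).__init__()

class Child(Stopper, Cooperative):
    def __init__(self):
        print('Child.__init__')
        super(Child, self).__init__()

# MRO is Child -> Stopper -> Cooperative -> object, but Stopper breaks the chain:
Child()  # prints Child.__init__ then Stopper.__init__; Cooperative.__init__ never runs
```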

3

That code is fine for a child class: `PenalizedMixin` is a child of `object`.

However, as the name suggests, it is meant to be a mixin. That is, it is designed to be used as one of the parents in a multiple-inheritance scenario. `super` calls the next class in the method resolution order, which is not necessarily the parent of that class.
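This is the mechanism `PenalizedMixin` relies on. A simplified sketch of my own (the class names here are illustrative, not from statsmodels):

```python
class PenaltyMixin(object):
    def fit(self):
        print('PenaltyMixin.fit')
        # super looks at the MRO of the *instance's* class, so this call
        # reaches Model.fit, even though Model is not a parent of PenaltyMixin.
        return super(PenaltyMixin, self).fit()

class Model(object):
    def fit(self):
        print('Model.fit')
        return 'fitted'

class PenalizedModel(PenaltyMixin, Model):
    pass

result = PenalizedModel().fit()  # PenaltyMixin.fit runs first, then Model.fit
```

Used on its own, `PenaltyMixin().fit()` would fail exactly like the `funz2` example below, because `object` has no `fit`; the mixin only works when combined with a class that provides one.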

In any case, I don't understand your "simpler" example. The reason the original code works is that the superclass has an `__init__` method. `object` does not have a `funz2` method.
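For completeness, a sketch of my own showing one way the "simpler" example could be made to work: give `A` a base class that actually defines `funz2`, so the `super` call has something to find (the class `B` here is hypothetical, introduced only for illustration).

```python
class B(object):
    def funz2(self, x):
        print(x * 2)

class A(B):
    def funz2(self, x):
        super(A, self).funz2(x)  # now resolves to B.funz2 instead of failing
        print(x + 1)

a = A()
a.funz2(4)  # prints 8, then 5
```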

+0

Can you give a simple example? Maybe I am not clear on the use of the `object` keyword. – Donbeo

+0

A simple example showing the use of this construct in a multiple-inheritance scenario would help. – Donbeo

+2

I keep mentioning this, but it is by far the best explanation: [Python's super() considered super!](https://rhettinger.wordpress.com/2011/05/26/super-considered-super/). –

Related questions