
Python learning notes (4)

  1. Xavier initialization:

    1. The basic idea: keep the variance of each layer's input and output the same as signals pass through the network, in both forward and backward propagation.

    2. If the initial weights are too small, the variance shrinks toward 0 as it passes through the layers and the activations become smaller and smaller; a sigmoid then operates near 0, where it is nearly linear, so the network effectively loses its nonlinearity.

    3. If the initial weights are too large, the variance grows rapidly through the layers and the activations become large; the sigmoid's derivative at large inputs tends to 0, so back-propagation runs into the vanishing-gradient problem.

    4. Its purpose is similar to that of batch normalization (BN).
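The points above can be demonstrated numerically. Below is a minimal NumPy sketch (not from the original notes; names are illustrative): it draws weights with Var(W) = 2 / (fan_in + fan_out), the Xavier/Glorot rule, and compares the output variance of one linear layer against a "very small" initialization.

```python
import numpy as np

# Sketch, assuming the standard Xavier/Glorot uniform rule:
# Uniform(-limit, limit) has variance limit**2 / 3, so choosing
# limit = sqrt(6 / (fan_in + fan_out)) gives Var(W) = 2 / (fan_in + fan_out).
def xavier_init(fan_in, fan_out, rng):
    limit = np.sqrt(6.0 / (fan_in + fan_out))
    return rng.uniform(-limit, limit, size=(fan_in, fan_out))

rng = np.random.default_rng(0)
x = rng.standard_normal((1000, 256))              # inputs with variance ~1

w_xavier = xavier_init(256, 256, rng)
w_tiny = rng.standard_normal((256, 256)) * 0.001  # the "very small" case from point 2

print(np.var(x @ w_xavier))  # stays near Var(x) = 1
print(np.var(x @ w_tiny))    # collapses toward 0, as the note describes
```

With fan_in = 256, the output variance under Xavier is fan_in * Var(x) * Var(W) = 256 * 1 * (2/512) = 1, while the tiny initialization yields roughly 2.6e-4; stacking more such layers would drive it toward 0.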

  2. NumPy-related:
    1. np.newaxis: insert a new axis (of length 1).

    2. Converting between list and ndarray:

      1. list to ndarray: np.array(a)

      2. ndarray to list: a.tolist()

    3. Find the k largest elements of a list: heapq.nlargest(k, lst)
    4. Converting an ndarray to and from bytes:

      1. buf = arr.tobytes() (tostring() is a deprecated alias for the same method)

      2. arr = np.frombuffer(buf, dtype=np.float32)
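The snippets above can be exercised in one short script (a sketch with illustrative variable names; note that the dtype passed to np.frombuffer must match the array's original dtype):

```python
import heapq
import numpy as np

a = [3, 1, 4, 1, 5, 9, 2, 6]

# list -> ndarray and back
arr = np.array(a, dtype=np.float32)
back = arr.tolist()

# np.newaxis inserts a new length-1 axis
row = arr[np.newaxis, :]   # shape (1, 8)
col = arr[:, np.newaxis]   # shape (8, 1)

# k largest elements of a list
top3 = heapq.nlargest(3, a)

# ndarray <-> bytes round trip (tobytes() replaces the deprecated tostring())
buf = arr.tobytes()
restored = np.frombuffer(buf, dtype=np.float32)

print(top3, row.shape, col.shape, np.array_equal(arr, restored))
# -> [9, 6, 5] (1, 8) (8, 1) True
```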

  3. Reference links for packages:
    1. pydub tutorial: https://blog.csdn.net/Debatrix/article/details/59058762

    2. pyworld usage reference: https://blog.csdn.net/m0_43395719/article/details/107930075


Copyright © 程式師世界 All Rights Reserved