
Release-3.2.0

Released by @ouyangwen-it on 18 Aug, 13:41 · 19 commits to branch-3.2.0 since this release

Summary

In version 3.2.0, Angel continues to strengthen its graph computing capabilities. Compared with the previous version, we have made many optimizations and added several new features, including:

  • Layered graph computing abstraction with flexible extension
  • Mixed parameter server and MPI running mode
  • Adaptive model partitioning
  • Support for complex heterogeneous graph embedding
  • High-performance optimizations for graphs with hundreds of billions of edges
  • Enriched machine learning algorithm library

New Features

  • Layered graph computing abstraction with flexible extension

    • Provides three layers of abstraction: the graph computing engine layer, the graph operator layer, and the graph algorithm layer
    • Provides more than a dozen commonly used operator abstractions, such as init, get, walker, and sample, as well as interfaces for custom operators (a sketch follows this item)
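
A minimal sketch of what a custom operator in the operator layer might look like. The trait names (GraphContext, GraphOperator) and their methods are illustrative assumptions, not Angel's actual interfaces:

```scala
// Hypothetical operator-layer interfaces; not Angel's actual API.
trait GraphContext {
  def pullNeighbors(nodeIds: Array[Long]): Map[Long, Array[Long]]
  def pushUpdates(updates: Map[Long, Array[Float]]): Unit
}

trait GraphOperator[P, R] {
  /** One-time setup, e.g. registering matrices on the parameter server. */
  def init(ctx: GraphContext): Unit
  /** Execute the operator for a batch of nodes and return a result. */
  def apply(ctx: GraphContext, params: P): R
}

/** Example custom operator: uniformly sample up to `k` neighbors per node. */
class SampleNeighbors(k: Int) extends GraphOperator[Array[Long], Map[Long, Array[Long]]] {
  override def init(ctx: GraphContext): Unit = ()

  override def apply(ctx: GraphContext, nodeIds: Array[Long]): Map[Long, Array[Long]] = {
    val rnd = new scala.util.Random()
    ctx.pullNeighbors(nodeIds).map { case (node, neighbors) =>
      node -> rnd.shuffle(neighbors.toSeq).take(k).toArray
    }
  }
}
```
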
  • Mixed parameter server and MPI running mode

    • Angel PS can now be started in embedded mode inside a Worker (or Executor), so a single model can use PS-style communication and an MPI-style ring communication topology at the same time (see the sketch below).
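
A minimal sketch of the mixed running mode, assuming two hypothetical interfaces: an EmbeddedPS agent started inside the worker/executor and a Ring allreduce over the workers. The names and signatures are illustrative only, not Angel's actual API:

```scala
// Hypothetical interfaces for the mixed mode; not Angel's actual API.
trait EmbeddedPS {
  def pull(matrix: String, rows: Array[Long]): Array[Array[Float]]
  def push(matrix: String, rows: Array[Long], grads: Array[Array[Float]]): Unit
}

trait Ring {
  /** MPI-style ring allreduce: element-wise sum across all workers. */
  def allReduce(denseGrad: Array[Float]): Array[Float]
}

object MixedModeWorker {
  /** One training step that mixes both communication patterns. */
  def step(ps: EmbeddedPS, ring: Ring, batchNodes: Array[Long]): Unit = {
    // Sparse embedding rows go through the PS started inside this worker ...
    val embeddings = ps.pull("node_embedding", batchNodes)

    // ... while dense parameters are synchronized over the ring topology.
    val denseGrad = computeDenseGrad(embeddings)
    val syncedDenseGrad = ring.allReduce(denseGrad)

    val sparseGrad = computeSparseGrad(embeddings, syncedDenseGrad)
    ps.push("node_embedding", batchNodes, sparseGrad)
  }

  // Placeholders for the actual model computation.
  private def computeDenseGrad(emb: Array[Array[Float]]): Array[Float] =
    new Array[Float](128)
  private def computeSparseGrad(emb: Array[Array[Float]], dense: Array[Float]): Array[Array[Float]] =
    emb.map(row => new Array[Float](row.length))
}
```
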
  • Adaptive model partitioning

    • Optimized the partition routing of models on the parameter server to support both range and hash partitioning. During graph algorithm training, the appropriate partitioning method can be selected adaptively according to the computation characteristics of each algorithm (see the sketch below).
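
A minimal sketch of how a partitioning method could be chosen adaptively from the node-id distribution. The RangePartition/HashPartition types and the density heuristic are illustrative assumptions, not Angel's actual partition-routing code:

```scala
// Illustrative adaptive choice between range and hash partitioning.
sealed trait PartitionMethod
case object RangePartition extends PartitionMethod // contiguous id ranges per partition
case object HashPartition  extends PartitionMethod // ids scattered by a hash function

object PartitionSelector {
  /**
   * Dense, contiguous node ids (e.g. 0..n-1 after re-indexing) route well by range;
   * sparse or hashed 64-bit ids are spread more evenly by hash.
   */
  def choose(maxNodeId: Long, numNodes: Long): PartitionMethod = {
    val density = numNodes.toDouble / (maxNodeId.toDouble + 1.0)
    if (density > 0.5) RangePartition else HashPartition
  }
}

// Example: 1e9 nodes re-indexed to 0..1e9-1 -> RangePartition;
// the same nodes keyed by raw 64-bit hashes -> HashPartition.
```
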
  • Support for complex heterogeneous graph embedding

    • Enriched and extended the graph storage structures and computation modes, and provided a flexible custom psFunc interface for complex operations. This supports the storage and computation of complex heterogeneous graph networks, including high-dimensional sparse node features, and makes representation learning on heterogeneous graphs straightforward.
    • Implemented five out-of-the-box heterogeneous graph neural network algorithms: HAN, heterogeneous GAT, heterogeneous GraphSage, IGMC edge prediction, and heterogeneous Bipartite GraphSage (a sketch of one aggregation step follows this item)
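
A minimal sketch of one mean-aggregation step on a bipartite user-item graph, in the spirit of the Bipartite GraphSage algorithm above. The data structures and the feature-pull callback (standing in for a custom psFunc pull) are hypothetical illustrations, not Angel's actual API:

```scala
// Hypothetical stand-ins for a psFunc-based feature pull; not Angel's actual API.
object BipartiteSageSketch {
  type NodeId  = Long
  type Feature = Array[Float]

  /** Mean-aggregate item features into each user node (one hop of a bipartite graph). */
  def aggregateUsersFromItems(
      userToItems: Map[NodeId, Array[NodeId]],
      pullItemFeatures: Array[NodeId] => Map[NodeId, Feature] // e.g. a custom psFunc pull
  ): Map[NodeId, Feature] = {
    val itemIds   = userToItems.values.flatten.toArray.distinct
    val itemFeats = pullItemFeatures(itemIds)

    userToItems.map { case (user, items) =>
      val feats = items.flatMap(itemFeats.get)
      val dim   = feats.headOption.map(_.length).getOrElse(0)
      val agg   = new Array[Float](dim)
      feats.foreach(f => f.indices.foreach(i => agg(i) += f(i)))
      if (feats.nonEmpty) agg.indices.foreach(i => agg(i) /= feats.length)
      user -> agg
    }
  }
}
```
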
  • High-performance optimizations for graphs with hundreds of billions of edges

    • Dedicated performance optimizations were made for training on graphs with hundreds of billions of edges; benchmark results are provided for the k-core and common-friends algorithms.
  • Enriched machine learning algorithm library

    • Added more than a dozen feature engineering methods, such as Correlation, Discrete, MutualInformation, and RandomizedSVD, as well as the multi-task learning algorithm ESMM (a correlation sketch follows below)
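
A minimal sketch of the Pearson correlation statistic behind a Correlation-style feature selection step, written in plain Scala for illustration rather than using the algorithm library's own API:

```scala
// Plain-Scala Pearson correlation between a feature column and the label.
object CorrelationSketch {
  def pearson(x: Array[Double], y: Array[Double]): Double = {
    require(x.length == y.length && x.nonEmpty)
    val n = x.length
    val meanX = x.sum / n
    val meanY = y.sum / n
    val cov  = (0 until n).map(i => (x(i) - meanX) * (y(i) - meanY)).sum
    val varX = x.map(v => (v - meanX) * (v - meanX)).sum
    val varY = y.map(v => (v - meanY) * (v - meanY)).sum
    cov / math.sqrt(varX * varY)
  }

  def main(args: Array[String]): Unit = {
    val feature = Array(1.0, 2.0, 3.0, 4.0)
    val label   = Array(2.1, 3.9, 6.2, 8.0)
    println(f"corr = ${pearson(feature, label)}%.4f") // close to 1.0 for this example
  }
}
```
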