Commit 95f5221: update oneDNN Graph API to v3.1 (#484)

Parent: 97aeffd

4 files changed: 152 additions & 0 deletions

source/elements/oneDNN/include/dnnl_graph.hpp (6 additions, 0 deletions)

@@ -115,6 +115,10 @@ inline dnnl::engine make_engine_with_allocator(dnnl::engine::kind kind,
 
 /// Logical tensor object
 struct logical_tensor {
+    /// Integer type for representing dimension sizes and indices.
+    using dim = dnnl_dim_t;
+    /// Vector of dimensions. Implementations are free to force a limit on the
+    /// vector's length.
     using dims = std::vector<dim>;
 
     /// Data Type

@@ -431,6 +435,8 @@ struct op {
         Exp,
         GELU,
         GELUBackward,
+        HardSigmoid,
+        HardSigmoidBackward,
         HardSwish,
         HardSwishBackward,
         Interpolate,
source/elements/oneDNN/source/graph/ops/HardSigmoid.rst (new file, 71 additions)

.. SPDX-FileCopyrightText: 2020-2023 Intel Corporation
..
.. SPDX-License-Identifier: CC-BY-4.0

.. include:: ../../replacements.inc.rst


HardSigmoid
###########

The HardSigmoid operation applies the following formula to every element of
the \src tensor (the variable names follow the standard @ref
dev_guide_conventions):

.. math::

    \dst = \max(0, \min(1, \alpha \src + \beta))

Operation Attributes
********************

+--------------+----------------+------------+--------------+-------------+
| Attribute    | Description    | Value Type | Supported    | Required or |
| Name         |                |            | Values       | Optional    |
+==============+================+============+==============+=============+
| |attr_alpha| | :math:`\alpha` | f32        | Arbitrary    | Required    |
|              | in the         |            | f32 value    |             |
|              | formula.       |            |              |             |
+--------------+----------------+------------+--------------+-------------+
| |attr_beta|  | :math:`\beta`  | f32        | Arbitrary    | Required    |
|              | in the         |            | f32 value    |             |
|              | formula.       |            |              |             |
+--------------+----------------+------------+--------------+-------------+

Execution Arguments
*******************

The inputs and outputs must be provided according to the index order below
when constructing an operation.

Inputs
======

===== ============= ====================
Index Argument Name Required or Optional
===== ============= ====================
0     ``src``       Required
===== ============= ====================

Outputs
=======

===== ============= ====================
Index Argument Name Required or Optional
===== ============= ====================
0     ``dst``       Required
===== ============= ====================

Supported Data Types
********************

The HardSigmoid operation supports the following data type combinations.

==== ====
Src  Dst
==== ====
f32  f32
bf16 bf16
f16  f16
==== ====
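For readers cross-checking the formula above, here is a minimal Python sketch of the elementwise computation. It is illustrative only and not part of the oneDNN API; the ``alpha``/``beta`` values in the example are hypothetical, not defaults mandated by the spec.

```python
def hard_sigmoid(src, alpha, beta):
    """Elementwise dst = max(0, min(1, alpha * src + beta))."""
    return [max(0.0, min(1.0, alpha * x + beta)) for x in src]

# Illustrative parameters alpha = 1/6, beta = 0.5: values below the lower
# saturation point clamp to 0, values above the upper one clamp to 1.
print(hard_sigmoid([-4.0, 0.0, 4.0], alpha=1 / 6, beta=0.5))
```

Note that both clamping bounds come from the ``max``/``min`` in the formula, so the output always lies in [0, 1] regardless of the attribute values.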
source/elements/oneDNN/source/graph/ops/HardSigmoidBackward.rst (new file, 73 additions)

.. SPDX-FileCopyrightText: 2020-2023 Intel Corporation
..
.. SPDX-License-Identifier: CC-BY-4.0

.. include:: ../../replacements.inc.rst


HardSigmoidBackward
###################

The HardSigmoidBackward operation computes the gradient of HardSigmoid. The
formula is defined as follows:

.. math::

    \diffsrc = \begin{cases} \diffdst \cdot \alpha & \text{if}\ 0 < \alpha \src + \beta < 1 \\ 0 & \text{otherwise} \end{cases}

Operation Attributes
********************

+--------------+----------------+------------+--------------+-------------+
| Attribute    | Description    | Value Type | Supported    | Required or |
| Name         |                |            | Values       | Optional    |
+==============+================+============+==============+=============+
| |attr_alpha| | :math:`\alpha` | f32        | Arbitrary    | Required    |
|              | in the         |            | f32 value    |             |
|              | formula.       |            |              |             |
+--------------+----------------+------------+--------------+-------------+
| |attr_beta|  | :math:`\beta`  | f32        | Arbitrary    | Required    |
|              | in the         |            | f32 value    |             |
|              | formula.       |            |              |             |
+--------------+----------------+------------+--------------+-------------+

Execution Arguments
*******************

The inputs and outputs must be provided according to the index order below
when constructing an operation.

Inputs
======

===== ============= ====================
Index Argument Name Required or Optional
===== ============= ====================
0     ``src``       Required
1     ``diff_dst``  Required
===== ============= ====================

Outputs
=======

===== ============= ====================
Index Argument Name Required or Optional
===== ============= ====================
0     ``diff_src``  Required
===== ============= ====================

Supported Data Types
********************

The HardSigmoidBackward operation supports the following data type
combinations.

======== ======== =========
Src      Diff_dst Diff_src
======== ======== =========
f32      f32      f32
f16      f16      f16
bf16     bf16     bf16
======== ======== =========
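The piecewise gradient above can likewise be sketched in plain Python. This is an illustration of the formula, not oneDNN code; note that the gradient passes through scaled by ``alpha`` only where the forward pre-clamp value lies strictly inside (0, 1), and is zero in both saturated regions.

```python
def hard_sigmoid_backward(src, diff_dst, alpha, beta):
    """Elementwise diff_src = diff_dst * alpha if 0 < alpha*src + beta < 1,
    else 0 (the forward op is flat in its saturated regions)."""
    return [dd * alpha if 0.0 < alpha * x + beta < 1.0 else 0.0
            for x, dd in zip(src, diff_dst)]

# With the same illustrative alpha = 1/6, beta = 0.5 as the forward example:
# x = -4 and x = 4 fall in the saturated regions, so their gradients are 0.
print(hard_sigmoid_backward([-4.0, 0.0, 4.0], [1.0, 1.0, 1.0],
                            alpha=1 / 6, beta=0.5))
```

A quick sanity check is that the nonzero entries always equal ``diff_dst * alpha``, matching the slope of the linear middle segment of the forward formula.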

source/elements/oneDNN/source/graph/ops/index.rst (2 additions, 0 deletions)

@@ -40,6 +40,8 @@ subset of the operation set.
    Exp.rst
    GELU.rst
    GELUBackward.rst
+   HardSigmoid.rst
+   HardSigmoidBackward.rst
    HardSwish.rst
    HardSwishBackward.rst
    Interpolate.rst
