POSTRISC virtual processor instruction set

Register files

128-bit general-purpose registers (128 of 128)
r0 r1 r2 r3 r4 r5 r6 r7
r8 r9 r10 r11 r12 r13 r14 r15
r16 r17 r18 r19 r20 r21 r22 r23
r24 r25 r26 r27 r28 r29 r30 r31
r32 r33 r34 r35 r36 r37 r38 r39
r40 r41 r42 r43 r44 r45 r46 r47
r48 r49 r50 r51 r52 r53 r54 r55
r56 r57 r58 r59 r60 r61 r62 r63
r64 r65 r66 r67 r68 r69 r70 r71
r72 r73 r74 r75 r76 r77 r78 r79
r80 r81 r82 r83 r84 r85 r86 r87
r88 r89 r90 r91 r92 r93 r94 r95
r96 r97 r98 r99 r100 r101 r102 r103
r104 r105 r106 r107 r108 r109 r110 r111
r112 r113 r114 r115 r116 r117 r118 r119
g0 g1 g2 g3 tp fp sp gz

64/128-bit special-purpose registers (39 of 128 defined)
ip (0) eip (1) fpcr (2) eca (3) 4 5 6 7
rsc (8) rsp (9) bsp (10) 11 12 13 14 15
psr (16) reip (17) kip (18) ksp (19) krsp (20) peb (21) teb (22) itc (23)
itm (24) pta (25) iva (26) 27 28 29 30 31
32 33 34 35 36 37 38 39
iip (40) iipa (41) ipsr (42) cause (43) ifa (44) iib (45) 46 47
48 49 50 51 52 53 54 55
56 57 58 59 60 61 62 63
irr0 (64) irr1 (65) irr2 (66) irr3 (67) 68 69 70 71
isr0 (72) isr1 (73) isr2 (74) isr3 (75) 76 77 78 79
iv (80) lid (81) tpr (82) itcv (83) tsv (84) pmv (85) cmcv (86) 87
88 89 90 91 92 93 94 95
96 97 98 99 100 101 102 103
104 105 106 107 108 109 110 111
112 113 114 115 116 117 118 119
120 121 122 123 124 125 126 127

Instruction fields/arguments

instruction field types (color-coded in the original tables):
primary opcode
extended opcode
general-purpose register number
special-purpose register number
immediate constant
shift (bit count)
modifier
reserved (must be zero)

Instruction formats

bundle formats:
a 128-bit bundle holds three 42-bit slots and a 2-bit template:
slot 3 (42 bits) | slot 2 (42 bits) | slot 1 (42 bits) | template (2 bits)
template 00: short slot 3, short slot 2, short slot 1
template 01: long slot 2, short slot 1
template 10: short slot 2, long slot 1
template 11: very long slot
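The table above implies a 128-bit bundle (3 × 42 + 2 bits). A minimal decoder sketch in C, assuming the template occupies the low 2 bits and slot 1 sits in the next 42 bits (the exact bit placement is an assumption for illustration, not stated by this table):

```c
#include <assert.h>
#include <stdint.h>

/* A 128-bit bundle modeled as two 64-bit halves (lo = bits 0..63). */
typedef struct { uint64_t lo, hi; } bundle128;

/* 2-bit template, assumed to be in the low bits of the bundle. */
static unsigned bundle_template(bundle128 b) {
    return (unsigned)(b.lo & 0x3);
}

/* Extract slot n (1..3), each 42 bits wide, assumed to start at
 * bit 2 + (n-1)*42 of the 128-bit value. */
static uint64_t bundle_slot(bundle128 b, int n) {
    unsigned shift = 2 + (unsigned)(n - 1) * 42;
    uint64_t v;
    if (shift >= 64)
        v = b.hi >> (shift - 64);            /* slot lies entirely in hi */
    else if (shift + 42 <= 64)
        v = b.lo >> shift;                   /* slot lies entirely in lo */
    else
        v = (b.lo >> shift) | (b.hi << (64 - shift)); /* straddles halves */
    return v & ((UINT64_C(1) << 42) - 1);
}
```

With this layout, slot 1 occupies bits 2..43, slot 2 bits 44..85, and slot 3 bits 86..127.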
slot formats (each line: format name, then its fields from bit 41 down to bit 0):
ld_imm opcode ra simm28
call opcode ra simm28x16
mem_iprel opcode ra uimm28
write opcode opx uimm28
jmp opcode opx simm28x16
nop opcode opx simm28
alloc opcode opx framesize 0
alloc_sp opcode opx framesize uimm21
retf opcode opx 0 uimm21
cache_simm opcode opx rb simm21
bin_simm opcode ra rb simm21
bin_uimm opcode ra rb uimm21
loop opcode ra rb opx uimm6 simm11x16
br_eh opcode opx rb 0 simm17x16
br_rr opcode ra rb opx simm17x16
br_rs opcode ra sb opx simm17x16
br_simm opcode ra simm11 simm17x16
br_uimm opcode ra uimm11 simm17x16
nul_simm opcode ra simm11 dn dy opx
nul_uimm opcode ra uimm11 dn dy opx
nul_rs opcode ra sb opx dn dy opx
nul_rr opcode ra rb opx dn dy opx
mid_bin_simm opcode ra rb simm14 opx
r4 opcode ra rb rc rd opx
r3s1 opcode ra rb rc sd opx
r2s2 opcode ra rb sc sd opx
r3s2 opcode ra rb rc sd se
r4rm opcode ra rb rc rd opx rm
r4mo opcode ra rb rc rd opx mo
gmemx opcode ra rb rc simm7 opx scale
r3scale opcode ra rb rc opx scale
RbcScale opcode 0 rb rc opx scale
Rbc opcode 0 rb rc opx 0
mspr opcode ra 0 spr opx 0
r2 opcode ra rb 0 opx 0
r2s1 opcode ra rb sc opx 0
r3 opcode ra rb rc opx 0
r2rm opcode ra rb 0 opx rm
r3rm opcode ra rb rc opx rm
r2mo opcode ra rb 0 opx mo
r3mo opcode ra rb rc opx mo
fpclass opcode ra rb uimm10 opx imm
gmemu opcode ra rb simm10 opx imm
int opcode 0 rb simm10 opx imm
fence opcode 0 opx mo
NoArgs opcode 0 opx 0
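The field widths in the slot-format table are consistent with a 7-bit primary opcode and 7-bit register numbers (128 GPRs need 7 bits; e.g. bin_simm splits as opcode 7 + ra 7 + rb 7 + simm21 21 = 42 bits). A field-extraction sketch for a bin_simm slot, assuming the opcode sits in the high bits with fields listed from bit 41 down (the bit order is an assumption for illustration):

```c
#include <assert.h>
#include <stdint.h>

/* bin_simm slot layout assumed here:
 * bits 41..35 opcode, 34..28 ra, 27..21 rb, 20..0 simm21. */
static unsigned slot_opcode(uint64_t s) { return (unsigned)((s >> 35) & 0x7F); }
static unsigned slot_ra(uint64_t s)     { return (unsigned)((s >> 28) & 0x7F); }
static unsigned slot_rb(uint64_t s)     { return (unsigned)((s >> 21) & 0x7F); }

/* Sign-extend the low 21 bits (bit 20 is the sign bit). */
static int64_t slot_simm21(uint64_t s) {
    uint64_t imm = s & 0x1FFFFF;
    if (imm & 0x100000)
        imm |= ~(uint64_t)0x1FFFFF;
    return (int64_t)imm;
}
```

The same pattern extends to the other formats by changing the shift amounts and immediate width.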

Instruction mnemonics and syntax

group | syntax | description
f128 | abs_diff_f128 ra,rb,rc,rm | absolute difference f128
f16 | abs_diff_f16 ra,rb,rc,rm | absolute difference f16
f32 | abs_diff_f32 ra,rb,rc,rm | absolute difference f32
f64 | abs_diff_f64 ra,rb,rc,rm | absolute difference f64
i128 | abs_diff_i128 ra,rb,rc | absolute difference i128
base | abs_diff_i32 ra,rb,rc | absolute difference i32
base | abs_diff_i64 ra,rb,rc | absolute difference i64
f16 | abs_diff_vf16 ra,rb,rc,rm | absolute difference vf16
f32 | abs_diff_vf32 ra,rb,rc,rm | absolute difference vf32
f64 | abs_diff_vf64 ra,rb,rc,rm | absolute difference vf64
f128 | abs_f128 ra,rb | absolute value f128
f16 | abs_f16 ra,rb | absolute value f16
f32 | abs_f32 ra,rb | absolute value f32
f64 | abs_f64 ra,rb | absolute value f64
i128 | abs_i128 ra,rb | absolute value i128
base | abs_i32 ra,rb | absolute value i32
base | abs_i64 ra,rb | absolute value i64
f128 | abs_max_f128 ra,rb,rc | absolute maximum f128
f16 | abs_max_f16 ra,rb,rc | absolute maximum f16
f32 | abs_max_f32 ra,rb,rc | absolute maximum f32
f64 | abs_max_f64 ra,rb,rc | absolute maximum f64
f16 | abs_max_vf16 ra,rb,rc | absolute maximum vf16
f32 | abs_max_vf32 ra,rb,rc | absolute maximum vf32
f64 | abs_max_vf64 ra,rb,rc | absolute maximum vf64
f128 | abs_min_f128 ra,rb,rc | absolute minimum f128
f16 | abs_min_f16 ra,rb,rc | absolute minimum f16
f32 | abs_min_f32 ra,rb,rc | absolute minimum f32
f64 | abs_min_f64 ra,rb,rc | absolute minimum f64
f16 | abs_min_vf16 ra,rb,rc | absolute minimum vf16
f32 | abs_min_vf32 ra,rb,rc | absolute minimum vf32
f64 | abs_min_vf64 ra,rb,rc | absolute minimum vf64
f16 | abs_vf16 ra,rb | absolute value vf16
f32 | abs_vf32 ra,rb | absolute value vf32
f64 | abs_vf64 ra,rb | absolute value vf64
base | add_add_i64 ra,rb,rc,rd | add-add i64
base | add_addc_u64 ra,rb,rc,rd | add-add with carry-out u64
f16 | add_alt_vf16 ra,rb,rc,rm | add alternating vf16
f32 | add_alt_vf32 ra,rb,rc,rm | add alternating vf32
f64 | add_alt_vf64 ra,rb,rc,rm | add alternating vf64
f128 | add_f128 ra,rb,rc,rm | add f128
f16 | add_f16 ra,rb,rc,rm | add f16
f32 | add_f32 ra,rb,rc,rm | add f32
f64 | add_f64 ra,rb,rc,rm | add f64
f16 | add_horiz_vf16 ra,rb,rc,rm | add horizontal vf16
f32 | add_horiz_vf32 ra,rb,rc,rm | add horizontal vf32
f64 | add_horiz_vf64 ra,rb,rc,rm | add horizontal vf64
i128 | add_i128 ra,rb,rc | add i128
base | add_i32 ra,rb,rc | add i32
base | add_i64 ra,rb,rc | add i64
i128 | add_imm_i128.l ra,rb,simm21 | add immediate i128
base | add_imm_i32.l ra,rb,simm21 | add immediate i32
base | add_imm_i64.l ra,rb,simm21 | add immediate i64
base | add_imm_u32.l ra,rb,simm21 | add immediate u32
mmx | add_sat_vi16 ra,rb,rc | add saturate vi16
mmx | add_sat_vi32 ra,rb,rc | add saturate vi32
mmx | add_sat_vi64 ra,rb,rc | add saturate vi64
mmx | add_sat_vi8 ra,rb,rc | add saturate vi8
mmx | add_sat_vu16 ra,rb,rc | add saturate vu16
mmx | add_sat_vu32 ra,rb,rc | add saturate vu32
mmx | add_sat_vu64 ra,rb,rc | add saturate vu64
mmx | add_sat_vu8 ra,rb,rc | add saturate vu8
base | add_sub_i64 ra,rb,rc,rd | add-subtract i64
base | add_u32 ra,rb,rc | add u32
f16 | add_vf16 ra,rb,rc,rm | add vf16
f32 | add_vf32 ra,rb,rc,rm | add vf32
f64 | add_vf64 ra,rb,rc,rm | add vf64
mmx | add_vu16 ra,rb,rc | add vu16
mmx | add_vu32 ra,rb,rc | add vu32
mmx | add_vu64 ra,rb,rc | add vu64
mmx | add_vu8 ra,rb,rc | add vu8
base | addc_u64 ra,rb,rc | add with carry-out u64
mmx | addc_vu16 ra,rb,rc | add with carry-out vu16
mmx | addc_vu32 ra,rb,rc | add with carry-out vu32
mmx | addc_vu64 ra,rb,rc | add with carry-out vu64
mmx | addc_vu8 ra,rb,rc | add with carry-out vu8
base | addo_i64 ra,rb,rc | add with overflow i64
mmx | addo_vi16 ra,rb,rc | add with overflow vi16
mmx | addo_vi32 ra,rb,rc | add with overflow vi32
mmx | addo_vi64 ra,rb,rc | add with overflow vi64
mmx | addo_vi8 ra,rb,rc | add with overflow vi8
cipher | aes_dec ra,rb,rc | aes decrypt round
cipher | aes_dec_last ra,rb,rc | aes decrypt last round
cipher | aes_enc ra,rb,rc | aes encrypt round
cipher | aes_enc_last ra,rb,rc | aes encrypt last round
cipher | aes_imc ra,rb | aes inverse mix columns
cipher | aes_keygen_assist ra,rb,simm10 | aes key generation assist
base | alignup_u64 ra,rb,sc,sd | align up shifted
special | alloc framesize | allocate register frame, update eip
special | alloc_sp.l framesize,uimm21 | allocate register frame, update eip and sp
atomic | amo_add_u128 ra,rb,rc,mo | atomic load-add u128
atomic | amo_add_u16 ra,rb,rc,mo | atomic load-add u16
atomic | amo_add_u32 ra,rb,rc,mo | atomic load-add u32
atomic | amo_add_u64 ra,rb,rc,mo | atomic load-add u64
atomic | amo_add_u8 ra,rb,rc,mo | atomic load-add u8
atomic | amo_and_u128 ra,rb,rc,mo | atomic load-and u128
atomic | amo_and_u16 ra,rb,rc,mo | atomic load-and u16
atomic | amo_and_u32 ra,rb,rc,mo | atomic load-and u32
atomic | amo_and_u64 ra,rb,rc,mo | atomic load-and u64
atomic | amo_and_u8 ra,rb,rc,mo | atomic load-and u8
atomic | amo_cas_u128 ra,rb,rc,rd,mo | atomic compare and swap u128
atomic | amo_cas_u16 ra,rb,rc,rd,mo | atomic compare and swap u16
atomic | amo_cas_u32 ra,rb,rc,rd,mo | atomic compare and swap u32
atomic | amo_cas_u64 ra,rb,rc,rd,mo | atomic compare and swap u64
atomic | amo_cas_u8 ra,rb,rc,rd,mo | atomic compare and swap u8
atomic | amo_ld_u128 ra,rb,mo | atomic load u128
atomic | amo_ld_u16 ra,rb,mo | atomic load u16
atomic | amo_ld_u32 ra,rb,mo | atomic load u32
atomic | amo_ld_u64 ra,rb,mo | atomic load u64
atomic | amo_ld_u8 ra,rb,mo | atomic load u8
atomic | amo_max_i128 ra,rb,rc,mo | atomic load-maximum i128
atomic | amo_max_i16 ra,rb,rc,mo | atomic load-maximum i16
atomic | amo_max_i32 ra,rb,rc,mo | atomic load-maximum i32
atomic | amo_max_i64 ra,rb,rc,mo | atomic load-maximum i64
atomic | amo_max_i8 ra,rb,rc,mo | atomic load-maximum i8
atomic | amo_max_u128 ra,rb,rc,mo | atomic load-maximum u128
atomic | amo_max_u16 ra,rb,rc,mo | atomic load-maximum u16
atomic | amo_max_u32 ra,rb,rc,mo | atomic load-maximum u32
atomic | amo_max_u64 ra,rb,rc,mo | atomic load-maximum u64
atomic | amo_max_u8 ra,rb,rc,mo | atomic load-maximum u8
atomic | amo_min_i128 ra,rb,rc,mo | atomic load-minimum i128
atomic | amo_min_i16 ra,rb,rc,mo | atomic load-minimum i16
atomic | amo_min_i32 ra,rb,rc,mo | atomic load-minimum i32
atomic | amo_min_i64 ra,rb,rc,mo | atomic load-minimum i64
atomic | amo_min_i8 ra,rb,rc,mo | atomic load-minimum i8
atomic | amo_min_u128 ra,rb,rc,mo | atomic load-minimum u128
atomic | amo_min_u16 ra,rb,rc,mo | atomic load-minimum u16
atomic | amo_min_u32 ra,rb,rc,mo | atomic load-minimum u32
atomic | amo_min_u64 ra,rb,rc,mo | atomic load-minimum u64
atomic | amo_min_u8 ra,rb,rc,mo | atomic load-minimum u8
atomic | amo_or_u128 ra,rb,rc,mo | atomic load-or u128
atomic | amo_or_u16 ra,rb,rc,mo | atomic load-or u16
atomic | amo_or_u32 ra,rb,rc,mo | atomic load-or u32
atomic | amo_or_u64 ra,rb,rc,mo | atomic load-or u64
atomic | amo_or_u8 ra,rb,rc,mo | atomic load-or u8
atomic | amo_st_u128 ra,rb,mo | atomic store u128
atomic | amo_st_u16 ra,rb,mo | atomic store u16
atomic | amo_st_u32 ra,rb,mo | atomic store u32
atomic | amo_st_u64 ra,rb,mo | atomic store u64
atomic | amo_st_u8 ra,rb,mo | atomic store u8
atomic | amo_sub_u128 ra,rb,rc,mo | atomic load-subtract u128
atomic | amo_sub_u16 ra,rb,rc,mo | atomic load-subtract u16
atomic | amo_sub_u32 ra,rb,rc,mo | atomic load-subtract u32
atomic | amo_sub_u64 ra,rb,rc,mo | atomic load-subtract u64
atomic | amo_sub_u8 ra,rb,rc,mo | atomic load-subtract u8
atomic | amo_swap_u128 ra,rb,rc,mo | atomic swap u128
atomic | amo_swap_u16 ra,rb,rc,mo | atomic swap u16
atomic | amo_swap_u32 ra,rb,rc,mo | atomic swap u32
atomic | amo_swap_u64 ra,rb,rc,mo | atomic swap u64
atomic | amo_swap_u8 ra,rb,rc,mo | atomic swap u8
atomic | amo_xor_u128 ra,rb,rc,mo | atomic load-xor u128
atomic | amo_xor_u16 ra,rb,rc,mo | atomic load-xor u16
atomic | amo_xor_u32 ra,rb,rc,mo | atomic load-xor u32
atomic | amo_xor_u64 ra,rb,rc,mo | atomic load-xor u64
atomic | amo_xor_u8 ra,rb,rc,mo | atomic load-xor u8
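The amo_* instructions follow the usual fetch-and-op pattern: atomically read the old value at the address in rb, combine it with rc, store the result, and return the old value in ra, with the mo modifier selecting the memory ordering. A semantics sketch for amo_add_u64 using C11 atomics (the mapping of mo to C11 memory_order values is an illustrative assumption, not an architectural definition):

```c
#include <assert.h>
#include <stdatomic.h>
#include <stdint.h>

/* amo_add_u64 ra,rb,rc,mo modeled as: ra = *rb; *rb += rc;
 * performed as one atomic read-modify-write with ordering `mo`. */
static uint64_t amo_add_u64(_Atomic uint64_t *addr, uint64_t val,
                            memory_order mo) {
    return atomic_fetch_add_explicit(addr, val, mo);
}
```

amo_and/or/xor/min/max/swap differ only in the combining operation, and amo_cas adds a fourth operand holding the expected value.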
base | and ra,rb,rc | bitwise and
base | and_dec ra,rb,rc | bitwise and decremented
base | and_imm.l ra,rb,simm21 | bitwise and with immediate
base | and_neg ra,rb,rc | bitwise and negate
base | andn ra,rb,rc | bitwise and-not
base | andn_imm.l ra,rb,simm21 | bitwise and-not with immediate
mmx | avg_vi16 ra,rb,rc | average vi16
mmx | avg_vi32 ra,rb,rc | average vi32
mmx | avg_vi64 ra,rb,rc | average vi64
mmx | avg_vi8 ra,rb,rc | average vi8
mmx | avg_vu16 ra,rb,rc | average vu16
mmx | avg_vu32 ra,rb,rc | average vu32
mmx | avg_vu64 ra,rb,rc | average vu64
mmx | avg_vu8 ra,rb,rc | average vu8
bitmanip | bit_clear ra,rb,rc | bit clear
bitmanip | bit_clear_imm ra,rb,sc | bit clear immediate
bitmanip | bit_flip ra,rb,rc | bit flip
bitmanip | bit_flip_imm ra,rb,sc | bit flip immediate
bitmanip | bit_set ra,rb,rc | bit set
bitmanip | bit_set_imm ra,rb,sc | bit set immediate
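The single-bit bitmanip instructions can be sketched as plain C expressions; this interpretation (ra receives rb with bit number rc set, cleared, or flipped) is inferred from the names only:

```c
#include <assert.h>
#include <stdint.h>

/* bit_set ra,rb,rc:   ra = rb |  (1 << rc)
 * bit_clear ra,rb,rc: ra = rb & ~(1 << rc)
 * bit_flip ra,rb,rc:  ra = rb ^  (1 << rc)
 * Bit numbers are masked to 0..63 here to keep the shifts defined. */
static uint64_t bit_set_u64(uint64_t v, unsigned n)   { return v |  (UINT64_C(1) << (n & 63)); }
static uint64_t bit_clear_u64(uint64_t v, unsigned n) { return v & ~(UINT64_C(1) << (n & 63)); }
static uint64_t bit_flip_u64(uint64_t v, unsigned n)  { return v ^  (UINT64_C(1) << (n & 63)); }
```

The _imm variants take the bit number as the sc shift-count field instead of a register.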
branch | br_bc.l ra,rb,simm17x16 | branch if bit clear
branch | br_bc_imm.l ra,sb,simm17x16 | branch if bit clear immediate
branch | br_bs.l ra,rb,simm17x16 | branch if bit set
branch | br_bs_imm.l ra,sb,simm17x16 | branch if bit set immediate
branch | br_eq_i128.l ra,rb,simm17x16 | branch if equal i128
branch | br_eq_i32.l ra,rb,simm17x16 | branch if equal i32
branch | br_eq_i64.l ra,rb,simm17x16 | branch if equal i64
branch | br_eq_imm_i128.l ra,simm11,simm17x16 | branch if equal immediate i128
branch | br_eq_imm_i32.l ra,simm11,simm17x16 | branch if equal immediate i32
branch | br_eq_imm_i64.l ra,simm11,simm17x16 | branch if equal immediate i64
branch | br_ge_i128.l ra,rb,simm17x16 | branch if greater-or-equal i128
branch | br_ge_i32.l ra,rb,simm17x16 | branch if greater-or-equal i32
branch | br_ge_i64.l ra,rb,simm17x16 | branch if greater-or-equal i64
branch | br_ge_imm_i128.l ra,simm11,simm17x16 | branch if greater-or-equal immediate i128
branch | br_ge_imm_i32.l ra,simm11,simm17x16 | branch if greater-or-equal immediate i32
branch | br_ge_imm_i64.l ra,simm11,simm17x16 | branch if greater-or-equal immediate i64
branch | br_ge_imm_u128.l ra,uimm11,simm17x16 | branch if greater-or-equal immediate u128
branch | br_ge_imm_u32.l ra,uimm11,simm17x16 | branch if greater-or-equal immediate u32
branch | br_ge_imm_u64.l ra,uimm11,simm17x16 | branch if greater-or-equal immediate u64
branch | br_ge_u128.l ra,rb,simm17x16 | branch if greater-or-equal u128
branch | br_ge_u32.l ra,rb,simm17x16 | branch if greater-or-equal u32
branch | br_ge_u64.l ra,rb,simm17x16 | branch if greater-or-equal u64
branch | br_lt_i128.l ra,rb,simm17x16 | branch if less i128
branch | br_lt_i32.l ra,rb,simm17x16 | branch if less i32
branch | br_lt_i64.l ra,rb,simm17x16 | branch if less i64
branch | br_lt_imm_i128.l ra,simm11,simm17x16 | branch if less immediate i128
branch | br_lt_imm_i32.l ra,simm11,simm17x16 | branch if less immediate i32
branch | br_lt_imm_i64.l ra,simm11,simm17x16 | branch if less immediate i64
branch | br_lt_imm_u128.l ra,uimm11,simm17x16 | branch if less immediate u128
branch | br_lt_imm_u32.l ra,uimm11,simm17x16 | branch if less immediate u32
branch | br_lt_imm_u64.l ra,uimm11,simm17x16 | branch if less immediate u64
branch | br_lt_u128.l ra,rb,simm17x16 | branch if less u128
branch | br_lt_u32.l ra,rb,simm17x16 | branch if less u32
branch | br_lt_u64.l ra,rb,simm17x16 | branch if less u64
branch | br_mask_all.l ra,uimm11,simm17x16 | branch if mask immediate all bits set
branch | br_mask_any.l ra,uimm11,simm17x16 | branch if mask immediate any bit set
branch | br_mask_none.l ra,uimm11,simm17x16 | branch if mask immediate no bits set
branch | br_mask_notall.l ra,uimm11,simm17x16 | branch if mask immediate not all bits set
branch | br_ne_i128.l ra,rb,simm17x16 | branch if not equal i128
branch | br_ne_i32.l ra,rb,simm17x16 | branch if not equal i32
branch | br_ne_i64.l ra,rb,simm17x16 | branch if not equal i64
branch | br_ne_imm_i128.l ra,simm11,simm17x16 | branch if not equal immediate i128
branch | br_ne_imm_i32.l ra,simm11,simm17x16 | branch if not equal immediate i32
branch | br_ne_imm_i64.l ra,simm11,simm17x16 | branch if not equal immediate i64
branch | br_o_f128.l ra,rb,simm17x16 | branch if ordered f128
branch | br_o_f32.l ra,rb,simm17x16 | branch if ordered f32
branch | br_o_f64.l ra,rb,simm17x16 | branch if ordered f64
branch | br_oeq_f128.l ra,rb,simm17x16 | branch if ordered and equal f128
branch | br_oeq_f32.l ra,rb,simm17x16 | branch if ordered and equal f32
branch | br_oeq_f64.l ra,rb,simm17x16 | branch if ordered and equal f64
branch | br_oge_f128.l ra,rb,simm17x16 | branch if ordered and greater-or-equal f128
branch | br_oge_f32.l ra,rb,simm17x16 | branch if ordered and greater-or-equal f32
branch | br_oge_f64.l ra,rb,simm17x16 | branch if ordered and greater-or-equal f64
branch | br_olt_f128.l ra,rb,simm17x16 | branch if ordered and less f128
branch | br_olt_f32.l ra,rb,simm17x16 | branch if ordered and less f32
branch | br_olt_f64.l ra,rb,simm17x16 | branch if ordered and less f64
branch | br_one_f128.l ra,rb,simm17x16 | branch if ordered and not-equal f128
branch | br_one_f32.l ra,rb,simm17x16 | branch if ordered and not-equal f32
branch | br_one_f64.l ra,rb,simm17x16 | branch if ordered and not-equal f64
branch | br_u_f128.l ra,rb,simm17x16 | branch if unordered f128
branch | br_u_f32.l ra,rb,simm17x16 | branch if unordered f32
branch | br_u_f64.l ra,rb,simm17x16 | branch if unordered f64
branch | br_ueq_f128.l ra,rb,simm17x16 | branch if unordered or equal f128
branch | br_ueq_f32.l ra,rb,simm17x16 | branch if unordered or equal f32
branch | br_ueq_f64.l ra,rb,simm17x16 | branch if unordered or equal f64
branch | br_uge_f128.l ra,rb,simm17x16 | branch if unordered or greater-or-equal f128
branch | br_uge_f32.l ra,rb,simm17x16 | branch if unordered or greater-or-equal f32
branch | br_uge_f64.l ra,rb,simm17x16 | branch if unordered or greater-or-equal f64
branch | br_ult_f128.l ra,rb,simm17x16 | branch if unordered or less f128
branch | br_ult_f32.l ra,rb,simm17x16 | branch if unordered or less f32
branch | br_ult_f64.l ra,rb,simm17x16 | branch if unordered or less f64
branch | br_une_f128.l ra,rb,simm17x16 | branch if unordered or not-equal f128
branch | br_une_f32.l ra,rb,simm17x16 | branch if unordered or not-equal f32
branch | br_une_f64.l ra,rb,simm17x16 | branch if unordered or not-equal f64
jump | call.l ra,simm28x16 | call relative
jump | call_mi.l ra,rb,simm14 | call memory indirect
jump | call_plt.l ra,uimm28 | call procedure linkage table
jump | call_ri ra,rb,rc | call register indirect
jump | call_rvt.l ra,rb,simm14 | call relative vtable
f128 | class_f128 ra,rb,uimm10 | classify f128
f16 | class_f16 ra,rb,uimm10 | classify f16
f32 | class_f32 ra,rb,uimm10 | classify f32
f64 | class_f64 ra,rb,uimm10 | classify f64
cipher | clmul ra,rb,rc,scale | carry-less multiply
i128 | cmov_eq_i128 ra,rb,rc,rd | conditional move if equal zero i128
base | cmov_eq_i32 ra,rb,rc,rd | conditional move if equal zero i32
base | cmov_eq_i64 ra,rb,rc,rd | conditional move if equal zero i64
i128 | cmov_le_i128 ra,rb,rc,rd | conditional move if less than or equal zero i128
base | cmov_le_i32 ra,rb,rc,rd | conditional move if less than or equal zero i32
base | cmov_le_i64 ra,rb,rc,rd | conditional move if less than or equal zero i64
base | cmov_lsb ra,rb,rc,rd | conditional move if least significant bit
i128 | cmov_lt_i128 ra,rb,rc,rd | conditional move if less than zero i128
base | cmov_lt_i32 ra,rb,rc,rd | conditional move if less than zero i32
base | cmov_lt_i64 ra,rb,rc,rd | conditional move if less than zero i64
i128 | cmp_eq_i128 ra,rb,rc | compare equal i128
base | cmp_eq_i32 ra,rb,rc | compare equal i32
base | cmp_eq_i64 ra,rb,rc | compare equal i64
i128 | cmp_eq_imm_i128.l ra,rb,simm21 | compare equal immediate i128
base | cmp_eq_imm_i32.l ra,rb,simm21 | compare equal immediate i32
base | cmp_eq_imm_i64.l ra,rb,simm21 | compare equal immediate i64
mmx | cmp_eq_vi16 ra,rb,rc | compare equal vi16
mmx | cmp_eq_vi32 ra,rb,rc | compare equal vi32
mmx | cmp_eq_vi64 ra,rb,rc | compare equal vi64
mmx | cmp_eq_vi8 ra,rb,rc | compare equal vi8
i128 | cmp_ge_i128 ra,rb,rc | compare greater-or-equal i128
base | cmp_ge_i32 ra,rb,rc | compare greater-or-equal i32
base | cmp_ge_i64 ra,rb,rc | compare greater-or-equal i64
i128 | cmp_ge_imm_i128.l ra,rb,simm21 | compare greater-or-equal immediate i128
base | cmp_ge_imm_i32.l ra,rb,simm21 | compare greater-or-equal immediate i32
base | cmp_ge_imm_i64.l ra,rb,simm21 | compare greater-or-equal immediate i64
i128 | cmp_ge_imm_u128.l ra,rb,uimm21 | compare greater-or-equal immediate u128
base | cmp_ge_imm_u32.l ra,rb,uimm21 | compare greater-or-equal immediate u32
base | cmp_ge_imm_u64.l ra,rb,uimm21 | compare greater-or-equal immediate u64
i128 | cmp_ge_u128 ra,rb,rc | compare greater-or-equal u128
base | cmp_ge_u32 ra,rb,rc | compare greater-or-equal u32
base | cmp_ge_u64 ra,rb,rc | compare greater-or-equal u64
i128 | cmp_lt_i128 ra,rb,rc | compare less i128
base | cmp_lt_i32 ra,rb,rc | compare less i32
base | cmp_lt_i64 ra,rb,rc | compare less i64
i128 | cmp_lt_imm_i128.l ra,rb,simm21 | compare less immediate i128
base | cmp_lt_imm_i32.l ra,rb,simm21 | compare less immediate i32
base | cmp_lt_imm_i64.l ra,rb,simm21 | compare less immediate i64
i128 | cmp_lt_imm_u128.l ra,rb,uimm21 | compare less immediate u128
base | cmp_lt_imm_u32.l ra,rb,uimm21 | compare less immediate u32
base | cmp_lt_imm_u64.l ra,rb,uimm21 | compare less immediate u64
i128 | cmp_lt_u128 ra,rb,rc | compare less u128
base | cmp_lt_u32 ra,rb,rc | compare less u32
base | cmp_lt_u64 ra,rb,rc | compare less u64
mmx | cmp_lt_vi16 ra,rb,rc | compare less vi16
mmx | cmp_lt_vi32 ra,rb,rc | compare less vi32
mmx | cmp_lt_vi64 ra,rb,rc | compare less vi64
mmx | cmp_lt_vi8 ra,rb,rc | compare less vi8
mmx | cmp_lt_vu16 ra,rb,rc | compare less vu16
mmx | cmp_lt_vu32 ra,rb,rc | compare less vu32
mmx | cmp_lt_vu64 ra,rb,rc | compare less vu64
mmx | cmp_lt_vu8 ra,rb,rc | compare less vu8
i128 | cmp_ne_i128 ra,rb,rc | compare not equal i128
base | cmp_ne_i32 ra,rb,rc | compare not equal i32
base | cmp_ne_i64 ra,rb,rc | compare not equal i64
i128 | cmp_ne_imm_i128.l ra,rb,simm21 | compare not equal immediate i128
base | cmp_ne_imm_i32.l ra,rb,simm21 | compare not equal immediate i32
base | cmp_ne_imm_i64.l ra,rb,simm21 | compare not equal immediate i64
f128 | cmp_o_f128 ra,rb,rc | compare ordered f128
f16 | cmp_o_f16 ra,rb,rc | compare ordered f16
f32 | cmp_o_f32 ra,rb,rc | compare ordered f32
f64 | cmp_o_f64 ra,rb,rc | compare ordered f64
f16 | cmp_o_vf16 ra,rb,rc | compare ordered vf16
f32 | cmp_o_vf32 ra,rb,rc | compare ordered vf32
f64 | cmp_o_vf64 ra,rb,rc | compare ordered vf64
f128 | cmp_oeq_f128 ra,rb,rc | compare ordered and equal f128
f16 | cmp_oeq_f16 ra,rb,rc | compare ordered and equal f16
f32 | cmp_oeq_f32 ra,rb,rc | compare ordered and equal f32
f64 | cmp_oeq_f64 ra,rb,rc | compare ordered and equal f64
f16 | cmp_oeq_vf16 ra,rb,rc | compare ordered and equal vf16
f32 | cmp_oeq_vf32 ra,rb,rc | compare ordered and equal vf32
f64 | cmp_oeq_vf64 ra,rb,rc | compare ordered and equal vf64
f128 | cmp_oge_f128 ra,rb,rc | compare ordered and greater-or-equal f128
f16 | cmp_oge_f16 ra,rb,rc | compare ordered and greater-or-equal f16
f32 | cmp_oge_f32 ra,rb,rc | compare ordered and greater-or-equal f32
f64 | cmp_oge_f64 ra,rb,rc | compare ordered and greater-or-equal f64
f16 | cmp_oge_vf16 ra,rb,rc | compare ordered and greater-or-equal vf16
f32 | cmp_oge_vf32 ra,rb,rc | compare ordered and greater-or-equal vf32
f64 | cmp_oge_vf64 ra,rb,rc | compare ordered and greater-or-equal vf64
f128 | cmp_olt_f128 ra,rb,rc | compare ordered and less f128
f16 | cmp_olt_f16 ra,rb,rc | compare ordered and less f16
f32 | cmp_olt_f32 ra,rb,rc | compare ordered and less f32
f64 | cmp_olt_f64 ra,rb,rc | compare ordered and less f64
f16 | cmp_olt_vf16 ra,rb,rc | compare ordered and less vf16
f32 | cmp_olt_vf32 ra,rb,rc | compare ordered and less vf32
f64 | cmp_olt_vf64 ra,rb,rc | compare ordered and less vf64
f128 | cmp_one_f128 ra,rb,rc | compare ordered and not-equal f128
f16 | cmp_one_f16 ra,rb,rc | compare ordered and not-equal f16
f32 | cmp_one_f32 ra,rb,rc | compare ordered and not-equal f32
f64 | cmp_one_f64 ra,rb,rc | compare ordered and not-equal f64
f16 | cmp_one_vf16 ra,rb,rc | compare ordered and not-equal vf16
f32 | cmp_one_vf32 ra,rb,rc | compare ordered and not-equal vf32
f64 | cmp_one_vf64 ra,rb,rc | compare ordered and not-equal vf64
f128 | cmp_u_f128 ra,rb,rc | compare unordered f128
f16 | cmp_u_f16 ra,rb,rc | compare unordered f16
f32 | cmp_u_f32 ra,rb,rc | compare unordered f32
f64 | cmp_u_f64 ra,rb,rc | compare unordered f64
f16 | cmp_u_vf16 ra,rb,rc | compare unordered vf16
f32 | cmp_u_vf32 ra,rb,rc | compare unordered vf32
f64 | cmp_u_vf64 ra,rb,rc | compare unordered vf64
f128 | cmp_ueq_f128 ra,rb,rc | compare unordered or equal f128
f16 | cmp_ueq_f16 ra,rb,rc | compare unordered or equal f16
f32 | cmp_ueq_f32 ra,rb,rc | compare unordered or equal f32
f64 | cmp_ueq_f64 ra,rb,rc | compare unordered or equal f64
f16 | cmp_ueq_vf16 ra,rb,rc | compare unordered or equal vf16
f32 | cmp_ueq_vf32 ra,rb,rc | compare unordered or equal vf32
f64 | cmp_ueq_vf64 ra,rb,rc | compare unordered or equal vf64
f128 | cmp_uge_f128 ra,rb,rc | compare unordered or greater-or-equal f128
f16 | cmp_uge_f16 ra,rb,rc | compare unordered or greater-or-equal f16
f32 | cmp_uge_f32 ra,rb,rc | compare unordered or greater-or-equal f32
f64 | cmp_uge_f64 ra,rb,rc | compare unordered or greater-or-equal f64
f16 | cmp_uge_vf16 ra,rb,rc | compare unordered or greater-or-equal vf16
f32 | cmp_uge_vf32 ra,rb,rc | compare unordered or greater-or-equal vf32
f64 | cmp_uge_vf64 ra,rb,rc | compare unordered or greater-or-equal vf64
f128 | cmp_ult_f128 ra,rb,rc | compare unordered or less f128
f16 | cmp_ult_f16 ra,rb,rc | compare unordered or less f16
f32 | cmp_ult_f32 ra,rb,rc | compare unordered or less f32
f64 | cmp_ult_f64 ra,rb,rc | compare unordered or less f64
f16 | cmp_ult_vf16 ra,rb,rc | compare unordered or less vf16
f32 | cmp_ult_vf32 ra,rb,rc | compare unordered or less vf32
f64 | cmp_ult_vf64 ra,rb,rc | compare unordered or less vf64
f128 | cmp_une_f128 ra,rb,rc | compare unordered or not-equal f128
f16 | cmp_une_f16 ra,rb,rc | compare unordered or not-equal f16
f32 | cmp_une_f32 ra,rb,rc | compare unordered or not-equal f32
f64 | cmp_une_f64 ra,rb,rc | compare unordered or not-equal f64
f16 | cmp_une_vf16 ra,rb,rc | compare unordered or not-equal vf16
f32 | cmp_une_vf32 ra,rb,rc | compare unordered or not-equal vf32
f64 | cmp_une_vf64 ra,rb,rc | compare unordered or not-equal vf64
bitmanip | cnt_lz ra,rb,sc | count leading zeros
bitmanip | cnt_pop ra,rb,sc | count population
bitmanip | cnt_tz ra,rb,sc | count trailing zeros
special | cpuid ra,rb,simm10 | cpu identification
cipher | crc32c ra,rb,rc,rd | crc32c
f128 | cvt_f128_f16 ra,rb,rm | convert f128 to f16
f128 | cvt_f128_f32 ra,rb,rm | convert f128 to f32
f128 | cvt_f128_f64 ra,rb,rm | convert f128 to f64
f128 | cvt_f128_i128 ra,rb,rm | convert f128 to i128
f128 | cvt_f128_i32 ra,rb,rm | convert f128 to i32
f128 | cvt_f128_i64 ra,rb,rm | convert f128 to i64
f128 | cvt_f128_u128 ra,rb,rm | convert f128 to u128
f128 | cvt_f128_u32 ra,rb,rm | convert f128 to u32
f128 | cvt_f128_u64 ra,rb,rm | convert f128 to u64
f16 | cvt_f16_i128 ra,rb,rm | convert f16 to i128
f16 | cvt_f16_i32 ra,rb,rm | convert f16 to i32
f16 | cvt_f16_i64 ra,rb,rm | convert f16 to i64
f16 | cvt_f16_u128 ra,rb,rm | convert f16 to u128
f16 | cvt_f16_u32 ra,rb,rm | convert f16 to u32
f16 | cvt_f16_u64 ra,rb,rm | convert f16 to u64
f16 | cvt_f32_f16 ra,rb,rm | convert f32 to f16
f32 | cvt_f32_i128 ra,rb,rm | convert f32 to i128
f32 | cvt_f32_i32 ra,rb,rm | convert f32 to i32
f32 | cvt_f32_i64 ra,rb,rm | convert f32 to i64
f32 | cvt_f32_u128 ra,rb,rm | convert f32 to u128
f32 | cvt_f32_u32 ra,rb,rm | convert f32 to u32
f32 | cvt_f32_u64 ra,rb,rm | convert f32 to u64
f16 | cvt_f64_f16 ra,rb,rm | convert f64 to f16
f32 | cvt_f64_f32 ra,rb,rm | convert f64 to f32
f64 | cvt_f64_i128 ra,rb,rm | convert f64 to i128
f64 | cvt_f64_i32 ra,rb,rm | convert f64 to i32
f64 | cvt_f64_i64 ra,rb,rm | convert f64 to i64
f64 | cvt_f64_u128 ra,rb,rm | convert f64 to u128
f64 | cvt_f64_u32 ra,rb,rm | convert f64 to u32
f64 | cvt_f64_u64 ra,rb,rm | convert f64 to u64
f128 | cvt_i128_f128 ra,rb,rm | convert i128 to f128
f16 | cvt_i128_f16 ra,rb,rm | convert i128 to f16
f32 | cvt_i128_f32 ra,rb,rm | convert i128 to f32
f64 | cvt_i128_f64 ra,rb,rm | convert i128 to f64
f128 | cvt_i32_f128 ra,rb,rm | convert i32 to f128
f16 | cvt_i32_f16 ra,rb,rm | convert i32 to f16
f32 | cvt_i32_f32 ra,rb,rm | convert i32 to f32
f64 | cvt_i32_f64 ra,rb,rm | convert i32 to f64
f128 | cvt_i64_f128 ra,rb,rm | convert i64 to f128
f16 | cvt_i64_f16 ra,rb,rm | convert i64 to f16
f32 | cvt_i64_f32 ra,rb,rm | convert i64 to f32
f64 | cvt_i64_f64 ra,rb,rm | convert i64 to f64
f128 | cvt_u128_f128 ra,rb,rm | convert u128 to f128
f16 | cvt_u128_f16 ra,rb,rm | convert u128 to f16
f32 | cvt_u128_f32 ra,rb,rm | convert u128 to f32
f64 | cvt_u128_f64 ra,rb,rm | convert u128 to f64
f128 | cvt_u32_f128 ra,rb,rm | convert u32 to f128
f16 | cvt_u32_f16 ra,rb,rm | convert u32 to f16
f32 | cvt_u32_f32 ra,rb,rm | convert u32 to f32
f64 | cvt_u32_f64 ra,rb,rm | convert u32 to f64
f128 | cvt_u64_f128 ra,rb,rm | convert u64 to f128
f16 | cvt_u64_f16 ra,rb,rm | convert u64 to f16
f32 | cvt_u64_f32 ra,rb,rm | convert u64 to f32
f64 | cvt_u64_f64 ra,rb,rm | convert u64 to f64
f16 | cvt_vf16_vi16 ra,rb,rm | convert vf16 to vi16
f16 | cvt_vf16_vu16 ra,rb,rm | convert vf16 to vu16
f32 | cvt_vf32_vi32 ra,rb,rm | convert vf32 to vi32
f32 | cvt_vf32_vu32 ra,rb,rm | convert vf32 to vu32
f64 | cvt_vf64_vi64 ra,rb,rm | convert vf64 to vi64
f64 | cvt_vf64_vu64 ra,rb,rm | convert vf64 to vu64
f16 | cvt_vi16_vf16 ra,rb,rm | convert vi16 to vf16
f32 | cvt_vi32_vf32 ra,rb,rm | convert vi32 to vf32
f64 | cvt_vi64_vf64 ra,rb,rm | convert vi64 to vf64
f16 | cvt_vu16_vf16 ra,rb,rm | convert vu16 to vf16
f32 | cvt_vu32_vf32 ra,rb,rm | convert vu32 to vf32
f64 | cvt_vu64_vf64 ra,rb,rm | convert vu64 to vf64
special | dcbf.l rb,simm21 | data cache block flush
privileged | dcbi.l rb,simm21 | data cache block invalidate
special | dcbt.l rb,simm21 | data cache block touch
base | deposit ra,rb,rc,sd,se | deposit
base | deposit_r ra,rb,rc,rd | deposit register
f128 | div_f128 ra,rb,rc,rm | divide f128
f16 | div_f16 ra,rb,rc,rm | divide f16
f32 | div_f32 ra,rb,rc,rm | divide f32
f64 | div_f64 ra,rb,rc,rm | divide f64
i128 | div_i128 ra,rb,rc | divide i128
base | div_i32 ra,rb,rc | divide i32
base | div_i64 ra,rb,rc | divide i64
base | div_imm_i32.l ra,rb,simm21 | divide immediate i32
base | div_imm_i64.l ra,rb,simm21 | divide immediate i64
base | div_imm_u32.l ra,rb,uimm21 | divide immediate u32
base | div_imm_u64.l ra,rb,uimm21 | divide immediate u64
i128 | div_u128 ra,rb,rc | divide u128
base | div_u32 ra,rb,rc | divide u32
base | div_u64 ra,rb,rc | divide u64
f16 | div_vf16 ra,rb,rc,rm | divide vf16
f32 | div_vf32 ra,rb,rc,rm | divide vf32
f64 | div_vf64 ra,rb,rc,rm | divide vf64
i128 | divp2_i128 ra,rb,rc | divide by power of 2 i128
base | divp2_i32 ra,rb,rc | divide by power of 2 i32
base | divp2_i64 ra,rb,rc | divide by power of 2 i64
i128 | divp2_imm_i128 ra,rb,sc | divide by power of 2 immediate i128
base | divp2_imm_i32 ra,rb,sc | divide by power of 2 immediate i32
base | divp2_imm_i64 ra,rb,sc | divide by power of 2 immediate i64
f16 | dot_vf16 ra,rb,rc,rm | dot-product vf16
f32 | dot_vf32 ra,rb,rc,rm | dot-product vf32
f64 | dot_vf64 ra,rb,rc,rm | dot-product vf64
special | eh_adj.l simm28x16 | exception handler adjust eip
special | eh_catch.l rb,simm17x16 | exception handler catch
special | eh_next.l rb,simm17x16 | exception handler next
special | eh_throw.l rb,simm21 | exception handler throw
f128 | extend_f16_f128 ra,rb | extend f16 to f128
f16 | extend_f16_f32 ra,rb | extend f16 to f32
f16 | extend_f16_f64 ra,rb | extend f16 to f64
f128 | extend_f32_f128 ra,rb | extend f32 to f128
f32 | extend_f32_f64 ra,rb | extend f32 to f64
f128 | extend_f64_f128 ra,rb | extend f64 to f128
atomic | fence mo | atomic fence
privileged | get_dbr ra,rb,simm10 | get data breakpoint register
privileged | get_ibr ra,rb,simm10 | get instruction breakpoint register
privileged | get_mr ra,rb,simm10 | get monitor register
special | get_spr ra,spr | get special-purpose register
bitmanip | gtb ra,rb | gray code to binary
privileged | halt | halt processor
special | icbi.l rb,simm21 | instruction cache block invalidate
special | int rb,simm10 | interrupt
jump | jmp.l simm28x16 | jump relative
special | jmp_mi rb,rc,scale | jump memory indirect
jump | jmp_r rb,rc,scale | jump register indirect
jump | jmp_t rb,rc | jump table
jump | jmp_t_i32 rb,rc | jump table i32 index
jump | jmp_t_u32 rb,rc | jump table u32 index
memoryld_i128.l ra,rb,simm21load base i128
memoryld_i16.l ra,rb,simm21load base i16
memoryld_i32.l ra,rb,simm21load base i32
memoryld_i64.l ra,rb,simm21load base i64
memoryld_i8.l ra,rb,simm21load base i8
baseld_imm.l ra,simm28load immediate
f32ld_imm_f32.l ra,fp32load immediate f32*
f64ld_imm_f64.l ra,fp64load immediate f64*
baseld_imm_high.l ra,simm28load immediate high
f128ld_iprel_f128.l ra,fp128load relative f128*
f32ld_iprel_f32.l ra,fp32load relative f32*
f64ld_iprel_f64.l ra,fp64load relative f64*
memoryld_iprel_i128.l ra,uimm28load relative i128
memoryld_iprel_i16.l ra,uimm28load relative i16
memoryld_iprel_i32.l ra,uimm28load relative i32
memoryld_iprel_i64.l ra,uimm28load relative i64
memoryld_iprel_i8.l ra,uimm28load relative i8
memoryld_iprel_u16.l ra,uimm28load relative u16
memoryld_iprel_u32.l ra,uimm28load relative u32
memoryld_iprel_u64.l ra,uimm28load relative u64
memoryld_iprel_u8.l ra,uimm28load relative u8
memoryld_mia_i128 ra,rb,simm10load and modify immediate after i128
memoryld_mia_i16 ra,rb,simm10load and modify immediate after i16
memoryld_mia_i32 ra,rb,simm10load and modify immediate after i32
memoryld_mia_i64 ra,rb,simm10load and modify immediate after i64
memoryld_mia_i8 ra,rb,simm10load and modify immediate after i8
memoryld_mia_u16 ra,rb,simm10load and modify immediate after u16
memoryld_mia_u32 ra,rb,simm10load and modify immediate after u32
memoryld_mia_u64 ra,rb,simm10load and modify immediate after u64
memoryld_mia_u8 ra,rb,simm10load and modify immediate after u8
memoryld_mib_i128 ra,rb,simm10load and modify immediate before i128
memoryld_mib_i16 ra,rb,simm10load and modify immediate before i16
memoryld_mib_i32 ra,rb,simm10load and modify immediate before i32
memoryld_mib_i64 ra,rb,simm10load and modify immediate before i64
memoryld_mib_i8 ra,rb,simm10load and modify immediate before i8
memoryld_mib_u16 ra,rb,simm10load and modify immediate before u16
memoryld_mib_u32 ra,rb,simm10load and modify immediate before u32
memoryld_mib_u64 ra,rb,simm10load and modify immediate before u64
memoryld_mib_u8 ra,rb,simm10load and modify immediate before u8
memoryld_u16.l ra,rb,simm21load base u16
memoryld_u32.l ra,rb,simm21load base u32
memoryld_u64.l ra,rb,simm21load base u64
memoryld_u8.l ra,rb,simm21load base u8
memoryld_xi32_i128.l ra,rb,rc,scale,simm7load i32-indexed i128
memoryld_xi32_i16.l ra,rb,rc,scale,simm7load i32-indexed i16
memoryld_xi32_i32.l ra,rb,rc,scale,simm7load i32-indexed i32
memoryld_xi32_i64.l ra,rb,rc,scale,simm7load i32-indexed i64
memoryld_xi32_i8.l ra,rb,rc,scale,simm7load i32-indexed i8
memoryld_xi32_u16.l ra,rb,rc,scale,simm7load i32-indexed u16
memoryld_xi32_u32.l ra,rb,rc,scale,simm7load i32-indexed u32
memoryld_xi32_u64.l ra,rb,rc,scale,simm7load i32-indexed u64
memoryld_xi32_u8.l ra,rb,rc,scale,simm7load i32-indexed u8
memoryld_xi64_i128.l ra,rb,rc,scale,simm7load i64-indexed i128
memoryld_xi64_i16.l ra,rb,rc,scale,simm7load i64-indexed i16
memoryld_xi64_i32.l ra,rb,rc,scale,simm7load i64-indexed i32
memoryld_xi64_i64.l ra,rb,rc,scale,simm7load i64-indexed i64
memoryld_xi64_i8.l ra,rb,rc,scale,simm7load i64-indexed i8
memoryld_xi64_u16.l ra,rb,rc,scale,simm7load i64-indexed u16
memoryld_xi64_u32.l ra,rb,rc,scale,simm7load i64-indexed u32
memoryld_xi64_u64.l ra,rb,rc,scale,simm7load i64-indexed u64
memoryld_xi64_u8.l ra,rb,rc,scale,simm7load i64-indexed u8
memoryld_xu32_i128.l ra,rb,rc,scale,simm7load u32-indexed i128
memoryld_xu32_i16.l ra,rb,rc,scale,simm7load u32-indexed i16
memoryld_xu32_i32.l ra,rb,rc,scale,simm7load u32-indexed i32
memoryld_xu32_i64.l ra,rb,rc,scale,simm7load u32-indexed i64
memoryld_xu32_i8.l ra,rb,rc,scale,simm7load u32-indexed i8
memoryld_xu32_u16.l ra,rb,rc,scale,simm7load u32-indexed u16
memoryld_xu32_u32.l ra,rb,rc,scale,simm7load u32-indexed u32
memoryld_xu32_u64.l ra,rb,rc,scale,simm7load u32-indexed u64
memoryld_xu32_u8.l ra,rb,rc,scale,simm7load u32-indexed u8
memoryld_xu64_i128.l ra,rb,rc,scale,simm7load u64-indexed i128
memoryld_xu64_i16.l ra,rb,rc,scale,simm7load u64-indexed i16
memoryld_xu64_i32.l ra,rb,rc,scale,simm7load u64-indexed i32
memoryld_xu64_i64.l ra,rb,rc,scale,simm7load u64-indexed i64
memoryld_xu64_i8.l ra,rb,rc,scale,simm7load u64-indexed i8
memoryld_xu64_u16.l ra,rb,rc,scale,simm7load u64-indexed u16
memoryld_xu64_u32.l ra,rb,rc,scale,simm7load u64-indexed u32
memoryld_xu64_u64.l ra,rb,rc,scale,simm7load u64-indexed u64
memoryld_xu64_u8.l ra,rb,rc,scale,simm7load u64-indexed u8
baselda_iprel.l ra,uimm28load address relative forward
baselda_n.l ra,rb,simm14load address near
baselda_nrc.l ra,rb,simm14load address near relative
baselda_r.l ra,simm28x16load address relative
baselda_xi32.l ra,rb,rc,scale,simm7load address i32-indexed
baselda_xi64.l ra,rb,rc,scale,simm7load address i64-indexed
baselda_xu32.l ra,rb,rc,scale,simm7load address u32-indexed
baselda_xu64.l ra,rb,rc,scale,simm7load address u64-indexed
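The indexed loads, stores, and lda_x* forms above all share the ra,rb,rc,scale,simm7 operand shape. As a hedged sketch (the operand roles — rb as base, rc as index, scale as a left-shift count, simm7 as a byte displacement — are assumptions read off the operand names, not taken from this listing), the effective-address computation would look like:

```c
#include <stdint.h>

/* Assumed effective-address computation for the ld_x* / st_x* / lda_x*
 * indexed forms: base + (extended index << scale) + displacement. */
static uint64_t ea_xi64(uint64_t base, int64_t index, unsigned scale, int32_t disp)
{
    return base + ((uint64_t)index << scale) + (uint64_t)(int64_t)disp;
}

/* _xu32 variant: the index is presumably truncated and zero-extended
 * to 32 bits before scaling (assumed from the naming). */
static uint64_t ea_xu32(uint64_t base, uint64_t index, unsigned scale, int32_t disp)
{
    return base + ((uint64_t)(uint32_t)index << scale) + (uint64_t)(int64_t)disp;
}
```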
f16madd_alt_vf16 ra,rb,rc,rd,rmmultiply-alternating add-subtract vf16
f32madd_alt_vf32 ra,rb,rc,rd,rmmultiply-alternating add-subtract vf32
f64madd_alt_vf64 ra,rb,rc,rd,rmmultiply-alternating add-subtract vf64
f128madd_f128 ra,rb,rc,rd,rmmultiply-add f128
f16madd_f16 ra,rb,rc,rd,rmmultiply-add f16
f32madd_f32 ra,rb,rc,rd,rmmultiply-add f32
f64madd_f64 ra,rb,rc,rd,rmmultiply-add f64
f16madd_vf16 ra,rb,rc,rd,rmmultiply-add vf16
f32madd_vf32 ra,rb,rc,rd,rmmultiply-add vf32
f64madd_vf64 ra,rb,rc,rd,rmmultiply-add vf64
f128max_f128 ra,rb,rcmaximum f128
f16max_f16 ra,rb,rcmaximum f16
f32max_f32 ra,rb,rcmaximum f32
f64max_f64 ra,rb,rcmaximum f64
i128max_i128 ra,rb,rcmaximum i128
basemax_i32 ra,rb,rcmaximum i32
basemax_i64 ra,rb,rcmaximum i64
basemax_imm_i32.l ra,rb,simm21maximum immediate i32
basemax_imm_i64.l ra,rb,simm21maximum immediate i64
basemax_imm_u32.l ra,rb,uimm21maximum immediate u32
basemax_imm_u64.l ra,rb,uimm21maximum immediate u64
i128max_u128 ra,rb,rcmaximum u128
basemax_u32 ra,rb,rcmaximum u32
basemax_u64 ra,rb,rcmaximum u64
f16max_vf16 ra,rb,rcmaximum vf16
f32max_vf32 ra,rb,rcmaximum vf32
f64max_vf64 ra,rb,rcmaximum vf64
mmxmax_vi16 ra,rb,rcmaximum vi16
mmxmax_vi32 ra,rb,rcmaximum vi32
mmxmax_vi64 ra,rb,rcmaximum vi64
mmxmax_vi8 ra,rb,rcmaximum vi8
mmxmax_vu16 ra,rb,rcmaximum vu16
mmxmax_vu32 ra,rb,rcmaximum vu32
mmxmax_vu64 ra,rb,rcmaximum vu64
mmxmax_vu8 ra,rb,rcmaximum vu8
f128maxnum_f128 ra,rb,rcmaximum number f128
f16maxnum_f16 ra,rb,rcmaximum number f16
f32maxnum_f32 ra,rb,rcmaximum number f32
f64maxnum_f64 ra,rb,rcmaximum number f64
f16maxnum_vf16 ra,rb,rcmaximum number vf16
f32maxnum_vf32 ra,rb,rcmaximum number vf32
f64maxnum_vf64 ra,rb,rcmaximum number vf64
bitmanipmbgath ra,rb,rcmasked bit gather
bitmanipmbscat ra,rb,rcmasked bit scatter
basembsel ra,rb,rc,rdmasked bit selection
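The three masked-bit instructions above correspond to the classic gather/scatter/select bit operations. A hedged C sketch (the operand roles — rb as source, rc as mask, and for mbsel rd as the selection mask — are assumptions based on the common definitions of these operations, cf. x86 PEXT/PDEP):

```c
#include <stdint.h>

/* mbgath: collect the source bits selected by the mask into the
 * low-order bits of the result (cf. x86 PEXT). */
static uint64_t mbgath(uint64_t src, uint64_t mask)
{
    uint64_t res = 0;
    for (unsigned i = 0, k = 0; i < 64; i++)
        if (mask & (1ull << i))
            res |= ((src >> i) & 1ull) << k++;
    return res;
}

/* mbscat: spread the low-order source bits out to the bit positions
 * selected by the mask (cf. x86 PDEP). */
static uint64_t mbscat(uint64_t src, uint64_t mask)
{
    uint64_t res = 0;
    for (unsigned i = 0, k = 0; i < 64; i++)
        if (mask & (1ull << i))
            res |= ((src >> k++) & 1ull) << i;
    return res;
}

/* mbsel: per-bit select — bits from b where the mask is set,
 * bits from c where it is clear. */
static uint64_t mbsel(uint64_t b, uint64_t c, uint64_t mask)
{
    return (b & mask) | (c & ~mask);
}
```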
f128merge_f128 ra,rb,rc,rdmerge f128
f16merge_f16 ra,rb,rc,rdmerge f16
f32merge_f32 ra,rb,rc,rdmerge f32
f64merge_f64 ra,rb,rc,rdmerge f64
f16merge_high_vf16 ra,rb,rc,rmmerge high parts vf16
f32merge_high_vf32 ra,rb,rc,rmmerge high parts vf32
f64merge_high_vf64 ra,rb,rc,rmmerge high parts vf64
mmxmerge_high_vu16 ra,rb,rcmerge high vu16
mmxmerge_high_vu32 ra,rb,rcmerge high vu32
mmxmerge_high_vu64 ra,rb,rcmerge high vu64
mmxmerge_high_vu8 ra,rb,rcmerge high vu8
f16merge_low_vf16 ra,rb,rc,rmmerge low parts vf16
f32merge_low_vf32 ra,rb,rc,rmmerge low parts vf32
f64merge_low_vf64 ra,rb,rc,rmmerge low parts vf64
mmxmerge_low_vu16 ra,rb,rcmerge low vu16
mmxmerge_low_vu32 ra,rb,rcmerge low vu32
mmxmerge_low_vu64 ra,rb,rcmerge low vu64
mmxmerge_low_vu8 ra,rb,rcmerge low vu8
f16merge_vf16 ra,rb,rc,rdmerge vf16
f32merge_vf32 ra,rb,rc,rdmerge vf32
f64merge_vf64 ra,rb,rc,rdmerge vf64
f128min_f128 ra,rb,rcminimum f128
f16min_f16 ra,rb,rcminimum f16
f32min_f32 ra,rb,rcminimum f32
f64min_f64 ra,rb,rcminimum f64
i128min_i128 ra,rb,rcminimum i128
basemin_i32 ra,rb,rcminimum i32
basemin_i64 ra,rb,rcminimum i64
basemin_imm_i32.l ra,rb,simm21minimum immediate i32
basemin_imm_i64.l ra,rb,simm21minimum immediate i64
basemin_imm_u32.l ra,rb,uimm21minimum immediate u32
basemin_imm_u64.l ra,rb,uimm21minimum immediate u64
i128min_u128 ra,rb,rcminimum u128
basemin_u32 ra,rb,rcminimum u32
basemin_u64 ra,rb,rcminimum u64
f16min_vf16 ra,rb,rcminimum vf16
f32min_vf32 ra,rb,rcminimum vf32
f64min_vf64 ra,rb,rcminimum vf64
mmxmin_vi16 ra,rb,rcminimum vi16
mmxmin_vi32 ra,rb,rcminimum vi32
mmxmin_vi64 ra,rb,rcminimum vi64
mmxmin_vi8 ra,rb,rcminimum vi8
mmxmin_vu16 ra,rb,rcminimum vu16
mmxmin_vu32 ra,rb,rcminimum vu32
mmxmin_vu64 ra,rb,rcminimum vu64
mmxmin_vu8 ra,rb,rcminimum vu8
f128minnum_f128 ra,rb,rcminimum number f128
f16minnum_f16 ra,rb,rcminimum number f16
f32minnum_f32 ra,rb,rcminimum number f32
f64minnum_f64 ra,rb,rcminimum number f64
f16minnum_vf16 ra,rb,rcminimum number vf16
f32minnum_vf32 ra,rb,rcminimum number vf32
f64minnum_vf64 ra,rb,rcminimum number vf64
basemov ra,rbmove general register
basemov2 ra,rb,rc,rdmove 2 general registers
specialmprobe ra,rb,rcmemory probe access
f16msub_alt_vf16 ra,rb,rc,rd,rmmultiply-alternating subtract-add vf16
f32msub_alt_vf32 ra,rb,rc,rd,rmmultiply-alternating subtract-add vf32
f64msub_alt_vf64 ra,rb,rc,rd,rmmultiply-alternating subtract-add vf64
f128msub_f128 ra,rb,rc,rd,rmmultiply-subtract f128
f16msub_f16 ra,rb,rc,rd,rmmultiply-subtract f16
f32msub_f32 ra,rb,rc,rd,rmmultiply-subtract f32
f64msub_f64 ra,rb,rc,rd,rmmultiply-subtract f64
f16msub_vf16 ra,rb,rc,rd,rmmultiply-subtract vf16
f32msub_vf32 ra,rb,rc,rd,rmmultiply-subtract vf32
f64msub_vf64 ra,rb,rc,rd,rmmultiply-subtract vf64
basemul_add ra,rb,rc,rdmultiply-add u64
f128mul_f128 ra,rb,rc,rmmultiply f128
f16mul_f16 ra,rb,rc,rmmultiply f16
f32mul_f32 ra,rb,rc,rmmultiply f32
f64mul_f64 ra,rb,rc,rmmultiply f64
basemul_h ra,rb,rcmultiply high
f16mul_horiz_vf16 ra,rb,rc,rmmultiply horizontal vf16
f32mul_horiz_vf32 ra,rb,rc,rmmultiply horizontal vf32
f64mul_horiz_vf64 ra,rb,rc,rmmultiply horizontal vf64
i128mul_i128 ra,rb,rcmultiply i128
basemul_i32 ra,rb,rcmultiply i32
basemul_i64 ra,rb,rcmultiply i64
basemul_imm_i32.l ra,rb,simm21multiply immediate i32
basemul_imm_i64.l ra,rb,simm21multiply immediate i64
basemul_imm_u32.l ra,rb,uimm21multiply immediate u32
basemul_sub ra,rb,rc,rdmultiply-subtract i64
basemul_subr ra,rb,rc,rdmultiply-subtract reverse i64
basemul_u32 ra,rb,rcmultiply u32
f16mul_vf16 ra,rb,rc,rmmultiply vf16
f32mul_vf32 ra,rb,rc,rmmultiply vf32
f64mul_vf64 ra,rb,rc,rmmultiply vf64
f128nabs_diff_f128 ra,rb,rc,rmnegate absolute difference f128
f16nabs_diff_f16 ra,rb,rc,rmnegate absolute difference f16
f32nabs_diff_f32 ra,rb,rc,rmnegate absolute difference f32
f64nabs_diff_f64 ra,rb,rc,rmnegate absolute difference f64
f16nabs_diff_vf16 ra,rb,rc,rmnegate absolute difference vf16
f32nabs_diff_vf32 ra,rb,rc,rmnegate absolute difference vf32
f64nabs_diff_vf64 ra,rb,rc,rmnegate absolute difference vf64
f128nabs_f128 ra,rbnegate absolute value f128
f16nabs_f16 ra,rbnegate absolute value f16
f32nabs_f32 ra,rbnegate absolute value f32
f64nabs_f64 ra,rbnegate absolute value f64
f16nabs_vf16 ra,rbnegate absolute value vf16
f32nabs_vf32 ra,rbnegate absolute value vf32
f64nabs_vf64 ra,rbnegate absolute value vf64
f128nadd_f128 ra,rb,rc,rmnegate add f128
f16nadd_f16 ra,rb,rc,rmnegate add f16
f32nadd_f32 ra,rb,rc,rmnegate add f32
f64nadd_f64 ra,rb,rc,rmnegate add f64
f16nadd_vf16 ra,rb,rc,rmnegate add vf16
f32nadd_vf32 ra,rb,rc,rmnegate add vf32
f64nadd_vf64 ra,rb,rc,rmnegate add vf64
basenand ra,rb,rcbitwise not-and
f128neg_f128 ra,rbnegate f128
f16neg_f16 ra,rbnegate f16
f32neg_f32 ra,rbnegate f32
f64neg_f64 ra,rbnegate f64
i128neg_i128 ra,rbnegate i128
baseneg_i32 ra,rbnegate i32
baseneg_i64 ra,rbnegate i64
f16neg_vf16 ra,rbnegate vf16
f32neg_vf32 ra,rbnegate vf32
f64neg_vf64 ra,rbnegate vf64
f128nmadd_f128 ra,rb,rc,rd,rmnegate multiply-add f128
f16nmadd_f16 ra,rb,rc,rd,rmnegate multiply-add f16
f32nmadd_f32 ra,rb,rc,rd,rmnegate multiply-add f32
f64nmadd_f64 ra,rb,rc,rd,rmnegate multiply-add f64
f16nmadd_vf16 ra,rb,rc,rd,rmnegate multiply-add vf16
f32nmadd_vf32 ra,rb,rc,rd,rmnegate multiply-add vf32
f64nmadd_vf64 ra,rb,rc,rd,rmnegate multiply-add vf64
f128nmsub_f128 ra,rb,rc,rd,rmnegate multiply-subtract f128
f16nmsub_f16 ra,rb,rc,rd,rmnegate multiply-subtract f16
f32nmsub_f32 ra,rb,rc,rd,rmnegate multiply-subtract f32
f64nmsub_f64 ra,rb,rc,rd,rmnegate multiply-subtract f64
f16nmsub_vf16 ra,rb,rc,rd,rmnegate multiply-subtract vf16
f32nmsub_vf32 ra,rb,rc,rd,rmnegate multiply-subtract vf32
f64nmsub_vf64 ra,rb,rc,rd,rmnegate multiply-subtract vf64
f128nmul_f128 ra,rb,rc,rmnegate multiply f128
f16nmul_f16 ra,rb,rc,rmnegate multiply f16
f32nmul_f32 ra,rb,rc,rmnegate multiply f32
f64nmul_f64 ra,rb,rc,rmnegate multiply f64
f16nmul_vf16 ra,rb,rc,rmnegate multiply vf16
f32nmul_vf32 ra,rb,rc,rmnegate multiply vf32
f64nmul_vf64 ra,rb,rc,rmnegate multiply vf64
basenop.l simm28no operation
basenor ra,rb,rcbitwise not-or
basenot ra,rbbitwise not
nullifyingnul_bc ra,rb,dy,dnnullify if bit clear
nullifyingnul_bc_imm ra,sb,dy,dnnullify if bit clear immediate
nullifyingnul_bs ra,rb,dy,dnnullify if bit set
nullifyingnul_bs_imm ra,sb,dy,dnnullify if bit set immediate
nullifyingnul_eq_i128 ra,rb,dy,dnnullify if equal i128
nullifyingnul_eq_i32 ra,rb,dy,dnnullify if equal i32
nullifyingnul_eq_i64 ra,rb,dy,dnnullify if equal i64
nullifyingnul_eq_imm_i128.l ra,simm11,dy,dnnullify if equal immediate i128
nullifyingnul_eq_imm_i32.l ra,simm11,dy,dnnullify if equal immediate i32
nullifyingnul_eq_imm_i64.l ra,simm11,dy,dnnullify if equal immediate i64
nullifyingnul_ge_i128 ra,rb,dy,dnnullify if greater or equal i128
nullifyingnul_ge_i32 ra,rb,dy,dnnullify if greater or equal i32
nullifyingnul_ge_i64 ra,rb,dy,dnnullify if greater or equal i64
nullifyingnul_ge_imm_i128.l ra,simm11,dy,dnnullify if greater or equal immediate i128
nullifyingnul_ge_imm_i32.l ra,simm11,dy,dnnullify if greater or equal immediate i32
nullifyingnul_ge_imm_i64.l ra,simm11,dy,dnnullify if greater or equal immediate i64
nullifyingnul_ge_imm_u128.l ra,uimm11,dy,dnnullify if greater or equal immediate u128
nullifyingnul_ge_imm_u32.l ra,uimm11,dy,dnnullify if greater or equal immediate u32
nullifyingnul_ge_imm_u64.l ra,uimm11,dy,dnnullify if greater or equal immediate u64
nullifyingnul_ge_u128 ra,rb,dy,dnnullify if greater or equal u128
nullifyingnul_ge_u32 ra,rb,dy,dnnullify if greater or equal u32
nullifyingnul_ge_u64 ra,rb,dy,dnnullify if greater or equal u64
nullifyingnul_lt_i128 ra,rb,dy,dnnullify if less i128
nullifyingnul_lt_i32 ra,rb,dy,dnnullify if less i32
nullifyingnul_lt_i64 ra,rb,dy,dnnullify if less i64
nullifyingnul_lt_imm_i128.l ra,simm11,dy,dnnullify if less immediate i128
nullifyingnul_lt_imm_i32.l ra,simm11,dy,dnnullify if less immediate i32
nullifyingnul_lt_imm_i64.l ra,simm11,dy,dnnullify if less immediate i64
nullifyingnul_lt_imm_u128.l ra,uimm11,dy,dnnullify if less immediate u128
nullifyingnul_lt_imm_u32.l ra,uimm11,dy,dnnullify if less immediate u32
nullifyingnul_lt_imm_u64.l ra,uimm11,dy,dnnullify if less immediate u64
nullifyingnul_lt_u128 ra,rb,dy,dnnullify if less u128
nullifyingnul_lt_u32 ra,rb,dy,dnnullify if less u32
nullifyingnul_lt_u64 ra,rb,dy,dnnullify if less u64
nullifyingnul_mask_all.l ra,uimm11,dy,dnnullify if mask immediate all bits set
nullifyingnul_mask_any.l ra,uimm11,dy,dnnullify if mask immediate any bit set
nullifyingnul_mask_none.l ra,uimm11,dy,dnnullify if mask immediate no bits set
nullifyingnul_mask_notall.l ra,uimm11,dy,dnnullify if mask immediate not all bits set
nullifyingnul_ne_i128 ra,rb,dy,dnnullify if not-equal i128
nullifyingnul_ne_i32 ra,rb,dy,dnnullify if not-equal i32
nullifyingnul_ne_i64 ra,rb,dy,dnnullify if not-equal i64
nullifyingnul_ne_imm_i128.l ra,simm11,dy,dnnullify if not-equal immediate i128
nullifyingnul_ne_imm_i32.l ra,simm11,dy,dnnullify if not-equal immediate i32
nullifyingnul_ne_imm_i64.l ra,simm11,dy,dnnullify if not-equal immediate i64
nullifyingnul_o_f128 ra,rb,dy,dnnullify if ordered f128
nullifyingnul_o_f32 ra,rb,dy,dnnullify if ordered f32
nullifyingnul_o_f64 ra,rb,dy,dnnullify if ordered f64
nullifyingnul_oeq_f128 ra,rb,dy,dnnullify if ordered and equal f128
nullifyingnul_oeq_f32 ra,rb,dy,dnnullify if ordered and equal f32
nullifyingnul_oeq_f64 ra,rb,dy,dnnullify if ordered and equal f64
nullifyingnul_oge_f128 ra,rb,dy,dnnullify if ordered and greater-or-equal f128
nullifyingnul_oge_f32 ra,rb,dy,dnnullify if ordered and greater-or-equal f32
nullifyingnul_oge_f64 ra,rb,dy,dnnullify if ordered and greater-or-equal f64
nullifyingnul_olt_f128 ra,rb,dy,dnnullify if ordered and less f128
nullifyingnul_olt_f32 ra,rb,dy,dnnullify if ordered and less f32
nullifyingnul_olt_f64 ra,rb,dy,dnnullify if ordered and less f64
nullifyingnul_one_f128 ra,rb,dy,dnnullify if ordered and not-equal f128
nullifyingnul_one_f32 ra,rb,dy,dnnullify if ordered and not-equal f32
nullifyingnul_one_f64 ra,rb,dy,dnnullify if ordered and not-equal f64
nullifyingnul_u_f128 ra,rb,dy,dnnullify if unordered f128
nullifyingnul_u_f32 ra,rb,dy,dnnullify if unordered f32
nullifyingnul_u_f64 ra,rb,dy,dnnullify if unordered f64
nullifyingnul_ueq_f128 ra,rb,dy,dnnullify if unordered or equal f128
nullifyingnul_ueq_f32 ra,rb,dy,dnnullify if unordered or equal f32
nullifyingnul_ueq_f64 ra,rb,dy,dnnullify if unordered or equal f64
nullifyingnul_uge_f128 ra,rb,dy,dnnullify if unordered or greater-or-equal f128
nullifyingnul_uge_f32 ra,rb,dy,dnnullify if unordered or greater-or-equal f32
nullifyingnul_uge_f64 ra,rb,dy,dnnullify if unordered or greater-or-equal f64
nullifyingnul_ult_f128 ra,rb,dy,dnnullify if unordered or less f128
nullifyingnul_ult_f32 ra,rb,dy,dnnullify if unordered or less f32
nullifyingnul_ult_f64 ra,rb,dy,dnnullify if unordered or less f64
nullifyingnul_une_f128 ra,rb,dy,dnnullify if unordered or not-equal f128
nullifyingnul_une_f32 ra,rb,dy,dnnullify if unordered or not-equal f32
nullifyingnul_une_f64 ra,rb,dy,dnnullify if unordered or not-equal f64
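All of the nul_* forms carry the dy,dn pair, which suggests they guard the following dy + dn instruction slots as an if-then-else without a branch. A hedged sketch of that model (under a literal reading of "nullify if <cond>", a true condition cancels the first dy slots and lets the next dn slots run; the polarity and the slot-count interpretation are assumptions from the mnemonics, not from the manual text):

```c
#include <stdbool.h>

/* Assumed dy/dn nullification model: fill exec[] with which of the
 * dy + dn guarded slots actually execute for a given condition. */
static void nullify_mask(bool cond, unsigned dy, unsigned dn, bool exec[])
{
    for (unsigned i = 0; i < dy + dn; i++)
        exec[i] = (i < dy) ? !cond : cond;   /* "yes" block first, then "no" block */
}
```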
baseor ra,rb,rcbitwise or
baseor_imm.l ra,rb,simm21bitwise or with immediate
baseorn ra,rb,rcbitwise or-not
baseorn_imm.l ra,rb,simm21bitwise or-not immediate
mmxpack_mod_vu16 ra,rb,rcpack unsigned modulo vu16
mmxpack_mod_vu32 ra,rb,rcpack unsigned modulo vu32
mmxpack_mod_vu64 ra,rb,rcpack unsigned modulo vu64
mmxpack_sat_vi16 ra,rb,rcpack saturated vi16
mmxpack_sat_vi32 ra,rb,rcpack saturated vi32
mmxpack_sat_vi64 ra,rb,rcpack saturated vi64
mmxpack_sat_vu16 ra,rb,rcpack saturated vu16
mmxpack_sat_vu32 ra,rb,rcpack saturated vu32
mmxpack_sat_vu64 ra,rb,rcpack saturated vu64
mmxpack_usat_vi16 ra,rb,rcpack unsigned saturated vi16
mmxpack_usat_vi32 ra,rb,rcpack unsigned saturated vi32
mmxpack_usat_vi64 ra,rb,rcpack unsigned saturated vi64
f16pack_vf16 ra,rb,rcpack vf16
f32pack_vf32 ra,rb,rcpack vf32
f64pack_vf64 ra,rb,rcpack vf64
bitmanipperm ra,rb,rc,rdpermute bytes
bitmanippermb ra,rb,scpermute bits
privilegedptc ra,rb,rcpurge translation cache
specialrandom ra,rbrandom
i128rem_i128 ra,rb,rcremainder i128
baserem_i32 ra,rb,rcremainder i32
baserem_i64 ra,rb,rcremainder i64
baserem_imm_i32.l ra,rb,simm21remainder immediate i32
baserem_imm_i64.l ra,rb,simm21remainder immediate i64
baserem_imm_u32.l ra,rb,uimm21remainder immediate u32
baserem_imm_u64.l ra,rb,uimm21remainder immediate u64
i128rem_u128 ra,rb,rcremainder u128
baserem_u32 ra,rb,rcremainder u32
baserem_u64 ra,rb,rcremainder u64
jumprep_ge_i32.l ra,rb,uimm6,simm11x16repeat on greater or equal i32
jumprep_ge_i64.l ra,rb,uimm6,simm11x16repeat on greater or equal i64
jumprep_ge_u32.l ra,rb,uimm6,simm11x16repeat on greater or equal u32
jumprep_ge_u64.l ra,rb,uimm6,simm11x16repeat on greater or equal u64
jumprep_gt_i32.l ra,rb,uimm6,simm11x16repeat on greater i32
jumprep_gt_i64.l ra,rb,uimm6,simm11x16repeat on greater i64
jumprep_gt_u32.l ra,rb,uimm6,simm11x16repeat on greater u32
jumprep_gt_u64.l ra,rb,uimm6,simm11x16repeat on greater u64
jumprep_le_i32.l ra,rb,uimm6,simm11x16repeat on less or equal i32
jumprep_le_i64.l ra,rb,uimm6,simm11x16repeat on less or equal i64
jumprep_le_u32.l ra,rb,uimm6,simm11x16repeat on less or equal u32
jumprep_le_u64.l ra,rb,uimm6,simm11x16repeat on less or equal u64
jumprep_lt_i32.l ra,rb,uimm6,simm11x16repeat on less i32
jumprep_lt_i64.l ra,rb,uimm6,simm11x16repeat on less i64
jumprep_lt_u32.l ra,rb,uimm6,simm11x16repeat on less u32
jumprep_lt_u64.l ra,rb,uimm6,simm11x16repeat on less u64
jumpretreturn from subroutine
jumpretf.l uimm21return from subroutine (rollback frame)
privilegedrfireturn from interruption
mmxrol_vu16 ra,rb,rcrotate left vu16
mmxrol_vu32 ra,rb,rcrotate left vu32
mmxrol_vu64 ra,rb,rcrotate left vu64
mmxrol_vu8 ra,rb,rcrotate left vu8
mmxror_vu16 ra,rb,rcrotate right vu16
mmxror_vu32 ra,rb,rcrotate right vu32
mmxror_vu64 ra,rb,rcrotate right vu64
mmxror_vu8 ra,rb,rcrotate right vu8
f128round_f128 ra,rb,rmround f128
f16round_f16 ra,rb,rmround f16
f32round_f32 ra,rb,rmround f32
f64round_f64 ra,rb,rmround f64
f16round_vf16 ra,rb,rmround vf16
f32round_vf32 ra,rb,rmround vf32
f64round_vf64 ra,rb,rmround vf64
f128roundnx_f128 ra,rb,rmround, detect inexact f128
f16roundnx_f16 ra,rb,rmround, detect inexact f16
f32roundnx_f32 ra,rb,rmround, detect inexact f32
f64roundnx_f64 ra,rb,rmround, detect inexact f64
f16roundnx_vf16 ra,rb,rmround, detect inexact vf16
f32roundnx_vf32 ra,rb,rmround, detect inexact vf32
f64roundnx_vf64 ra,rb,rmround, detect inexact vf64
privilegedrscoverregister stack cover
privilegedrsflushregister stack flush
privilegedrsloadregister stack load
f128rsqrt_f128 ra,rb,rmreciprocal square root f128
f16rsqrt_f16 ra,rb,rmreciprocal square root f16
f32rsqrt_f32 ra,rb,rmreciprocal square root f32
f64rsqrt_f64 ra,rb,rmreciprocal square root f64
f16rsqrt_vf16 ra,rb,rmreciprocal square root vf16
f32rsqrt_vf32 ra,rb,rmreciprocal square root vf32
f64rsqrt_vf64 ra,rb,rmreciprocal square root vf64
f128scale_f128 ra,rb,scscale f128
privilegedset_dbr ra,rb,simm10set data breakpoint register
privilegedset_dtr ra,rb,rcset data translation register
privilegedset_ibr ra,rb,simm10set instruction breakpoint register
privilegedset_itr ra,rb,rcset instruction translation register
privilegedset_mr ra,rb,simm10set monitor register
specialset_spr ra,sprset special-purpose register
basesext_i16 ra,rbsign extend i16
basesext_i32 ra,rbsign extend i32
basesext_i64 ra,rbsign extend i64
basesext_i8 ra,rbsign extend i8
basesl_add_i32 ra,rb,rc,sdshift left and add i32
basesl_add_i64 ra,rb,rc,sdshift left and add i64
basesl_add_u32 ra,rb,rc,sdshift left and add u32
basesl_or ra,rb,rc,sdshift left and or
basesl_sub_i32 ra,rb,rc,sdshift left and subtract i32
basesl_sub_i64 ra,rb,rc,sdshift left and subtract i64
basesl_sub_u32 ra,rb,rc,sdshift left and subtract u32
basesl_subr_i32 ra,rb,rc,sdshift left and subtract reverse i32
basesl_subr_i64 ra,rb,rc,sdshift left and subtract reverse i64
basesl_subr_u32 ra,rb,rc,sdshift left and subtract reverse u32
basesl_xor ra,rb,rc,sdshift left and xor
i128sll_imm_u128 ra,rb,scshift left logical immediate u128
basesll_imm_u32 ra,rb,scshift left logical immediate u32
basesll_imm_u64 ra,rb,scshift left logical immediate u64
mmxsll_imm_vu16 ra,rb,scshift left logical immediate vu16
mmxsll_imm_vu32 ra,rb,scshift left logical immediate vu32
mmxsll_imm_vu64 ra,rb,scshift left logical immediate vu64
mmxsll_imm_vu8 ra,rb,scshift left logical immediate vu8
i128sll_u128 ra,rb,rcshift left logical u128
basesll_u32 ra,rb,rcshift left logical u32
basesll_u64 ra,rb,rcshift left logical u64
mmxsll_vu16 ra,rb,rcshift left logical vu16
mmxsll_vu32 ra,rb,rcshift left logical vu32
mmxsll_vu64 ra,rb,rcshift left logical vu64
mmxsll_vu8 ra,rb,rcshift left logical vu8
i128slp_i128 ra,rb,rc,rdshift left pair i128
baseslp_i32 ra,rb,rc,rdshift left pair i32
baseslp_i64 ra,rb,rc,rdshift left pair i64
baseslsra_i32 ra,rb,rc,rdshift left and shift right algebraic i32
baseslsra_i64 ra,rb,rc,rdshift left and shift right algebraic i64
baseslsra_imm_i64 ra,rb,sc,sdshift left and shift right algebraic immediate i64
baseslsrl_imm_u64 ra,rb,sc,sdshift left and shift right logical immediate u64
baseslsrl_u32 ra,rb,rc,rdshift left and shift right logical u32
baseslsrl_u64 ra,rb,rc,rdshift left and shift right logical u64
f128sqrt_f128 ra,rb,rmsquare root f128
f16sqrt_f16 ra,rb,rmsquare root f16
f32sqrt_f32 ra,rb,rmsquare root f32
f64sqrt_f64 ra,rb,rmsquare root f64
f16sqrt_vf16 ra,rb,rmsquare root vf16
f32sqrt_vf32 ra,rb,rmsquare root vf32
f64sqrt_vf64 ra,rb,rmsquare root vf64
i128sra_i128 ra,rb,rcshift right algebraic i128
basesra_i32 ra,rb,rcshift right algebraic i32
basesra_i64 ra,rb,rcshift right algebraic i64
i128sra_imm_i128 ra,rb,scshift right algebraic immediate i128
basesra_imm_i32 ra,rb,scshift right algebraic immediate i32
basesra_imm_i64 ra,rb,scshift right algebraic immediate i64
mmxsra_imm_vi16 ra,rb,scshift right algebraic immediate vi16
mmxsra_imm_vi32 ra,rb,scshift right algebraic immediate vi32
mmxsra_imm_vi64 ra,rb,scshift right algebraic immediate vi64
mmxsra_imm_vi8 ra,rb,scshift right algebraic immediate vi8
mmxsra_vi16 ra,rb,rcshift right algebraic vi16
mmxsra_vi32 ra,rb,rcshift right algebraic vi32
mmxsra_vi64 ra,rb,rcshift right algebraic vi64
mmxsra_vi8 ra,rb,rcshift right algebraic vi8
i128srl_imm_u128 ra,rb,scshift right logical immediate u128
basesrl_imm_u32 ra,rb,scshift right logical immediate u32
basesrl_imm_u64 ra,rb,scshift right logical immediate u64
mmxsrl_imm_vu16 ra,rb,scshift right logical immediate vu16
mmxsrl_imm_vu32 ra,rb,scshift right logical immediate vu32
mmxsrl_imm_vu64 ra,rb,scshift right logical immediate vu64
mmxsrl_imm_vu8 ra,rb,scshift right logical immediate vu8
i128srl_u128 ra,rb,rcshift right logical u128
basesrl_u32 ra,rb,rcshift right logical u32
basesrl_u64 ra,rb,rcshift right logical u64
mmxsrl_vu16 ra,rb,rcshift right logical vu16
mmxsrl_vu32 ra,rb,rcshift right logical vu32
mmxsrl_vu64 ra,rb,rcshift right logical vu64
mmxsrl_vu8 ra,rb,rcshift right logical vu8
i128srp_i128 ra,rb,rc,rdshift right pair i128
basesrp_i32 ra,rb,rc,rdshift right pair i32
basesrp_i64 ra,rb,rc,rdshift right pair i64
i128srp_imm_i128 ra,rb,rc,sdshift right pair immediate i128
basesrp_imm_i32 ra,rb,rc,sdshift right pair immediate i32
basesrp_imm_i64 ra,rb,rc,sdshift right pair immediate i64
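The slp/srp pair shifts read as funnel shifts: the two sources are treated as one double-width value and a register-width window is taken after shifting. A hedged C sketch for the 64-bit forms (operand roles and the valid shift range — 1..63 here, to avoid undefined 64-bit shifts in C — are assumptions):

```c
#include <stdint.h>

/* Assumed shift-right-pair: 64-bit window of hi:lo shifted right by n. */
static uint64_t srp64(uint64_t hi, uint64_t lo, unsigned n) /* n in 1..63 */
{
    return (lo >> n) | (hi << (64 - n));
}

/* Assumed shift-left-pair: 64-bit window of hi:lo shifted left by n. */
static uint64_t slp64(uint64_t hi, uint64_t lo, unsigned n) /* n in 1..63 */
{
    return (hi << n) | (lo >> (64 - n));
}
```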
memoryst_i128.l ra,rb,simm21store base i128
memoryst_i16.l ra,rb,simm21store base i16
memoryst_i32.l ra,rb,simm21store base i32
memoryst_i64.l ra,rb,simm21store base i64
memoryst_i8.l ra,rb,simm21store base i8
memoryst_iprel_i128.l ra,uimm28store relative i128
memoryst_iprel_i16.l ra,uimm28store relative i16
memoryst_iprel_i32.l ra,uimm28store relative i32
memoryst_iprel_i64.l ra,uimm28store relative i64
memoryst_iprel_i8.l ra,uimm28store relative i8
memoryst_mia_i128 ra,rb,simm10store and modify immediate after i128
memoryst_mia_i16 ra,rb,simm10store and modify immediate after i16
memoryst_mia_i32 ra,rb,simm10store and modify immediate after i32
memoryst_mia_i64 ra,rb,simm10store and modify immediate after i64
memoryst_mia_i8 ra,rb,simm10store and modify immediate after i8
memoryst_mib_i128 ra,rb,simm10store and modify immediate before i128
memoryst_mib_i16 ra,rb,simm10store and modify immediate before i16
memoryst_mib_i32 ra,rb,simm10store and modify immediate before i32
memoryst_mib_i64 ra,rb,simm10store and modify immediate before i64
memoryst_mib_i8 ra,rb,simm10store and modify immediate before i8
memoryst_xi32_i128.l ra,rb,rc,scale,simm7store i32-indexed i128
memoryst_xi32_i16.l ra,rb,rc,scale,simm7store i32-indexed i16
memoryst_xi32_i32.l ra,rb,rc,scale,simm7store i32-indexed i32
memoryst_xi32_i64.l ra,rb,rc,scale,simm7store i32-indexed i64
memoryst_xi32_i8.l ra,rb,rc,scale,simm7store i32-indexed i8
memoryst_xi64_i128.l ra,rb,rc,scale,simm7store i64-indexed i128
memoryst_xi64_i16.l ra,rb,rc,scale,simm7store i64-indexed i16
memoryst_xi64_i32.l ra,rb,rc,scale,simm7store i64-indexed i32
memoryst_xi64_i64.l ra,rb,rc,scale,simm7store i64-indexed i64
memoryst_xi64_i8.l ra,rb,rc,scale,simm7store i64-indexed i8
memoryst_xu32_i128.l ra,rb,rc,scale,simm7store u32-indexed i128
memoryst_xu32_i16.l ra,rb,rc,scale,simm7store u32-indexed i16
memoryst_xu32_i32.l ra,rb,rc,scale,simm7store u32-indexed i32
memoryst_xu32_i64.l ra,rb,rc,scale,simm7store u32-indexed i64
memoryst_xu32_i8.l ra,rb,rc,scale,simm7store u32-indexed i8
memoryst_xu64_i128.l ra,rb,rc,scale,simm7store u64-indexed i128
memoryst_xu64_i16.l ra,rb,rc,scale,simm7store u64-indexed i16
memoryst_xu64_i32.l ra,rb,rc,scale,simm7store u64-indexed i32
memoryst_xu64_i64.l ra,rb,rc,scale,simm7store u64-indexed i64
memoryst_xu64_i8.l ra,rb,rc,scale,simm7store u64-indexed i8
f16sub_alt_vf16 ra,rb,rc,rmsubtract alternating vf16
f32sub_alt_vf32 ra,rb,rc,rmsubtract alternating vf32
f64sub_alt_vf64 ra,rb,rc,rmsubtract alternating vf64
f128sub_f128 ra,rb,rc,rmsubtract f128
f16sub_f16 ra,rb,rc,rmsubtract f16
f32sub_f32 ra,rb,rc,rmsubtract f32
f64sub_f64 ra,rb,rc,rmsubtract f64
f16sub_horiz_vf16 ra,rb,rc,rmsubtract horizontal vf16
f32sub_horiz_vf32 ra,rb,rc,rmsubtract horizontal vf32
f64sub_horiz_vf64 ra,rb,rc,rmsubtract horizontal vf64
i128sub_i128 ra,rb,rcsubtract i128
basesub_i32 ra,rb,rcsubtract i32
basesub_i64 ra,rb,rcsubtract i64
mmxsub_sat_vi16 ra,rb,rcsubtract saturated vi16
mmxsub_sat_vi32 ra,rb,rcsubtract saturated vi32
mmxsub_sat_vi64 ra,rb,rcsubtract saturated vi64
mmxsub_sat_vi8 ra,rb,rcsubtract saturated vi8
mmxsub_sat_vu16 ra,rb,rcsubtract saturated vu16
mmxsub_sat_vu32 ra,rb,rcsubtract saturated vu32
mmxsub_sat_vu64 ra,rb,rcsubtract saturated vu64
mmxsub_sat_vu8 ra,rb,rcsubtract saturated vu8
basesub_sub_i64 ra,rb,rc,rdsubtract-subtract i64
basesub_subb_u64 ra,rb,rc,rdsubtract-subtract with borrow-out u64
basesub_u32 ra,rb,rcsubtract u32
f16sub_vf16 ra,rb,rc,rmsubtract vf16
f32sub_vf32 ra,rb,rc,rmsubtract vf32
f64sub_vf64 ra,rb,rc,rmsubtract vf64
mmxsub_vu16 ra,rb,rcsubtract vu16
mmxsub_vu32 ra,rb,rcsubtract vu32
mmxsub_vu64 ra,rb,rcsubtract vu64
mmxsub_vu8 ra,rb,rcsubtract vu8
basesubb_u64 ra,rb,rcsubtract with borrow u64
mmxsubb_vu16 ra,rb,rcsubtract with borrow vu16
mmxsubb_vu32 ra,rb,rcsubtract with borrow vu32
mmxsubb_vu64 ra,rb,rcsubtract with borrow vu64
mmxsubb_vu8 ra,rb,rcsubtract with borrow vu8
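The subb_u64 form is the building block for multiword subtraction: each limb subtracts with the borrow produced by the previous limb. A hedged sketch of a 256-bit chain (this mirrors the usual subtract-with-borrow definition; the exact borrow-in/borrow-out convention of subb_u64 is an assumption):

```c
#include <stdint.h>
#include <stdbool.h>

/* 256-bit subtraction r = a - b, least-significant limb first,
 * chaining a borrow the way repeated subb_u64 presumably would. */
static void sub256(uint64_t r[4], const uint64_t a[4], const uint64_t b[4])
{
    bool borrow = false;
    for (int i = 0; i < 4; i++) {
        uint64_t t = a[i] - b[i];
        bool b1 = a[i] < b[i];            /* borrow from the limb subtract */
        r[i] = t - (borrow ? 1 : 0);
        bool b2 = t < (borrow ? 1u : 0u); /* borrow from subtracting the carry-in */
        borrow = b1 || b2;
    }
}
```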
basesubo_i64 ra,rb,rcsubtract overflow i64
mmxsubo_vi16 ra,rb,rcsubtract overflow vi16
mmxsubo_vi32 ra,rb,rcsubtract overflow vi32
mmxsubo_vi64 ra,rb,rcsubtract overflow vi64
mmxsubo_vi8 ra,rb,rcsubtract overflow vi8
basesubr_imm_i32.l ra,rb,simm21subtract reverse immediate i32
basesubr_imm_i64.l ra,rb,simm21subtract reverse immediate i64
basesubr_imm_u32.l ra,rb,simm21subtract reverse immediate u32
specialsyscallsystem call
privilegedsysretsystem return
privilegedtpa ra,rb,rctranslate to physical address
specialundefundefined instruction
f16unpack_high_vf16 ra,rbunpack high part vf16
f32unpack_high_vf32 ra,rbunpack high part vf32
f64unpack_high_vf64 ra,rbunpack high part vf64
mmxunpack_high_vi16 ra,rbunpack high vi16
mmxunpack_high_vi32 ra,rbunpack high vi32
mmxunpack_high_vi8 ra,rbunpack high vi8
mmxunpack_high_vu16 ra,rbunpack high vu16
mmxunpack_high_vu32 ra,rbunpack high vu32
mmxunpack_high_vu8 ra,rbunpack high vu8
f16unpack_low_vf16 ra,rbunpack low part vf16
f32unpack_low_vf32 ra,rbunpack low part vf32
f64unpack_low_vf64 ra,rbunpack low part vf64
mmxunpack_low_vi16 ra,rbunpack low vi16
mmxunpack_low_vi32 ra,rbunpack low vi32
mmxunpack_low_vi8 ra,rbunpack low vi8
mmxunpack_low_vu16 ra,rbunpack low vu16
mmxunpack_low_vu32 ra,rbunpack low vu32
mmxunpack_low_vu8 ra,rbunpack low vu8
specialwrite.l uimm28write formatted string
basexnor ra,rb,rcbitwise exclusive not-or
basexor ra,rb,rcbitwise exclusive or
basexor_dec ra,rb,rcbitwise exclusive-or decremented
basexor_imm.l ra,rb,simm21bitwise exclusive or with immediate
basezext_i16 ra,rbzero extend i16
basezext_i32 ra,rbzero extend i32
basezext_i64 ra,rbzero extend i64
basezext_i8 ra,rbzero extend i8
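The sext/zext pairs that close the alphabetical listing have the usual extension semantics and can be sketched directly in C (that the iN suffix names the width of the value being extended is assumed from the mnemonics):

```c
#include <stdint.h>

/* sext_iN: replicate bit N-1 into the upper bits of the result. */
static int64_t  sext_i8(uint64_t x)  { return (int8_t)x;  }
static int64_t  sext_i16(uint64_t x) { return (int16_t)x; }
static int64_t  sext_i32(uint64_t x) { return (int32_t)x; }

/* zext_iN: clear all bits above bit N-1. */
static uint64_t zext_i8(uint64_t x)  { return (uint8_t)x;  }
static uint64_t zext_i16(uint64_t x) { return (uint16_t)x; }
static uint64_t zext_i32(uint64_t x) { return (uint32_t)x; }
```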

Instruction encoding

instruction
mnemonic
bit numbers
41 40 39 38 37 36 35 34 33 32 31 30 29 28 27 26 25 24 23 22 21 20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1 0
abs_diff_f128
0 ra rb rc 1708 rm
abs_diff_f16
0 ra rb rc 1228 rm
abs_diff_f32
0 ra rb rc 1388 rm
abs_diff_f64
0 ra rb rc 1548 rm
abs_diff_i128
0 ra rb rc 306 0
abs_diff_i32
0 ra rb rc 146 0
abs_diff_i64
0 ra rb rc 226 0
abs_diff_vf16
0 ra rb rc 1299 rm
abs_diff_vf32
0 ra rb rc 1459 rm
abs_diff_vf64
0 ra rb rc 1619 rm
abs_f128
0 ra rb 0 1706 0
abs_f16
0 ra rb 0 1226 0
abs_f32
0 ra rb 0 1386 0
abs_f64
0 ra rb 0 1546 0
abs_i128
0 ra rb 0 307 0
abs_i32
0 ra rb 0 147 0
abs_i64
0 ra rb 0 227 0
abs_max_f128
0 ra rb rc 1715 0
abs_max_f16
0 ra rb rc 1235 0
abs_max_f32
0 ra rb rc 1395 0
abs_max_f64
0 ra rb rc 1555 0
abs_max_vf16
0 ra rb rc 1314 0
abs_max_vf32
0 ra rb rc 1474 0
abs_max_vf64
0 ra rb rc 1634 0
abs_min_f128
0 ra rb rc 1714 0
abs_min_f16
0 ra rb rc 1234 0
abs_min_f32
0 ra rb rc 1394 0
abs_min_f64
0 ra rb rc 1554 0
abs_min_vf16
0 ra rb rc 1313 0
abs_min_vf32
0 ra rb rc 1473 0
abs_min_vf64
0 ra rb rc 1633 0
abs_vf16
0 ra rb 0 1297 0
abs_vf32
0 ra rb 0 1457 0
abs_vf64
0 ra rb 0 1617 0
add_add_i64
1 ra rb rc rd 8
add_addc_u64
1 ra rb rc rd 11
add_alt_vf16
0 ra rb rc 1336 rm
add_alt_vf32
0 ra rb rc 1496 rm
add_alt_vf64
0 ra rb rc 1656 rm
add_f128
0 ra rb rc 1699 rm
add_f16
0 ra rb rc 1219 rm
add_f32
0 ra rb rc 1379 rm
add_f64
0 ra rb rc 1539 rm
add_horiz_vf16
0 ra rb rc 1338 rm
add_horiz_vf32
0 ra rb rc 1498 rm
add_horiz_vf64
0 ra rb rc 1658 rm
add_i128
0 ra rb rc 288 0
add_i32
0 ra rb rc 128 0
add_i64
0 ra rb rc 208 0
add_imm_i128
70 ra rb simm21
add_imm_i32
24 ra rb simm21
add_imm_i64
30 ra rb simm21
add_imm_u32
25 ra rb simm21
add_sat_vi16
0 ra rb rc 1867 0
add_sat_vi32
0 ra rb rc 1931 0
add_sat_vi64
0 ra rb rc 1995 0
add_sat_vi8
0 ra rb rc 1803 0
add_sat_vu16
0 ra rb rc 1866 0
add_sat_vu32
0 ra rb rc 1930 0
add_sat_vu64
0 ra rb rc 1994 0
add_sat_vu8
0 ra rb rc 1802 0
add_sub_i64
1 ra rb rc rd 9
add_u32
0 ra rb rc 161 0
add_vf16
0 ra rb rc 1303 rm
add_vf32
0 ra rb rc 1463 rm
add_vf64
0 ra rb rc 1623 rm
add_vu16
0 ra rb rc 1860 0
add_vu32
0 ra rb rc 1924 0
add_vu64
0 ra rb rc 1988 0
add_vu8
0 ra rb rc 1796 0
addc_u64
0 ra rb rc 245 0
addc_vu16
0 ra rb rc 1864 0
addc_vu32
0 ra rb rc 1928 0
addc_vu64
0 ra rb rc 1992 0
addc_vu8
0 ra rb rc 1800 0
addo_i64
0 ra rb rc 243 0
addo_vi16
0 ra rb rc 1862 0
addo_vi32
0 ra rb rc 1926 0
addo_vi64
0 ra rb rc 1990 0
addo_vi8
0 ra rb rc 1798 0
aes_dec
0 ra rb rc 554 0
aes_dec_last
0 ra rb rc 555 0
aes_enc
0 ra rb rc 552 0
aes_enc_last
0 ra rb rc 553 0
aes_imc
0 ra rb 0 556 0
aes_keygen_assist
0 ra rb simm10 557 imm
alignup_u64
1 ra rb sc sd 31
alloc
2 3 framesize 0
alloc_sp
2 4 framesize uimm21
amo_add_u128
0 ra rb rc 931 mo
amo_add_u16
0 ra rb rc 811 mo
amo_add_u32
0 ra rb rc 851 mo
amo_add_u64
0 ra rb rc 891 mo
amo_add_u8
0 ra rb rc 771 mo
amo_and_u128
0 ra rb rc 932 mo
amo_and_u16
0 ra rb rc 812 mo
amo_and_u32
0 ra rb rc 852 mo
amo_and_u64
0 ra rb rc 892 mo
amo_and_u8
0 ra rb rc 772 mo
amo_cas_u128
15 ra rb rc rd 15 mo
amo_cas_u16
15 ra rb rc rd 12 mo
amo_cas_u32
15 ra rb rc rd 13 mo
amo_cas_u64
15 ra rb rc rd 14 mo
amo_cas_u8
15 ra rb rc rd 11 mo
amo_ld_u128
0 ra rb 0 928 mo
amo_ld_u16
0 ra rb 0 808 mo
amo_ld_u32
0 ra rb 0 848 mo
amo_ld_u64
0 ra rb 0 888 mo
amo_ld_u8
0 ra rb 0 768 mo
amo_max_i128
0 ra rb rc 936 mo
amo_max_i16
0 ra rb rc 816 mo
amo_max_i32
0 ra rb rc 856 mo
amo_max_i64
0 ra rb rc 896 mo
amo_max_i8
0 ra rb rc 776 mo
amo_max_u128
0 ra rb rc 938 mo
amo_max_u16
0 ra rb rc 818 mo
amo_max_u32
0 ra rb rc 858 mo
amo_max_u64
0 ra rb rc 898 mo
amo_max_u8
0 ra rb rc 778 mo
amo_min_i128
0 ra rb rc 935 mo
amo_min_i16
0 ra rb rc 815 mo
amo_min_i32
0 ra rb rc 855 mo
amo_min_i64
0 ra rb rc 895 mo
amo_min_i8
0 ra rb rc 775 mo
amo_min_u128
0 ra rb rc 937 mo
amo_min_u16
0 ra rb rc 817 mo
amo_min_u32
0 ra rb rc 857 mo
amo_min_u64
0 ra rb rc 897 mo
amo_min_u8
0 ra rb rc 777 mo
amo_or_u128
0 ra rb rc 933 mo
amo_or_u16
0 ra rb rc 813 mo
amo_or_u32
0 ra rb rc 853 mo
amo_or_u64
0 ra rb rc 893 mo
amo_or_u8
0 ra rb rc 773 mo
amo_st_u128
0 ra rb 0 929 mo
amo_st_u16
0 ra rb 0 809 mo
amo_st_u32
0 ra rb 0 849 mo
amo_st_u64
0 ra rb 0 889 mo
amo_st_u8
0 ra rb 0 769 mo
amo_sub_u128
0 ra rb rc 939 mo
amo_sub_u16
0 ra rb rc 819 mo
amo_sub_u32
0 ra rb rc 859 mo
amo_sub_u64
0 ra rb rc 899 mo
amo_sub_u8
0 ra rb rc 779 mo
amo_swap_u128
0 ra rb rc 930 mo
amo_swap_u16
0 ra rb rc 810 mo
amo_swap_u32
0 ra rb rc 850 mo
amo_swap_u64
0 ra rb rc 890 mo
amo_swap_u8
0 ra rb rc 770 mo
amo_xor_u128
0 ra rb rc 934 mo
amo_xor_u16
0 ra rb rc 814 mo
amo_xor_u32
0 ra rb rc 854 mo
amo_xor_u64
0 ra rb rc 894 mo
amo_xor_u8
0 ra rb rc 774 mo
and
0 ra rb rc 4 0
and_dec
0 ra rb rc 241 0
and_imm
20 ra rb simm21
and_neg
0 ra rb rc 242 0
andn
0 ra rb rc 7 0
andn_imm
18 ra rb simm21
avg_vi16
0 ra rb rc 1870 0
avg_vi32
0 ra rb rc 1934 0
avg_vi64
0 ra rb rc 1998 0
avg_vi8
0 ra rb rc 1806 0
avg_vu16
0 ra rb rc 1871 0
avg_vu32
0 ra rb rc 1935 0
avg_vu64
0 ra rb rc 1999 0
avg_vu8
0 ra rb rc 1807 0
bit_clear
0 ra rb rc 28 0
bit_clear_imm
0 ra rb sc 29 0
bit_flip
0 ra rb rc 32 0
bit_flip_imm
0 ra rb sc 33 0
bit_set
0 ra rb rc 30 0
bit_set_imm
0 ra rb sc 31 0
br_bc
3 ra rb 0 simm17x16
br_bc_imm
3 ra sb 1 simm17x16
br_bs
3 ra rb 2 simm17x16
br_bs_imm
3 ra sb 3 simm17x16
br_eq_i128
7 ra rb 0 simm17x16
br_eq_i32
5 ra rb 0 simm17x16
br_eq_i64
6 ra rb 0 simm17x16
br_eq_imm_i128
118 ra simm11 simm17x16
br_eq_imm_i32
106 ra simm11 simm17x16
br_eq_imm_i64
112 ra simm11 simm17x16
br_ge_i128
7 ra rb 3 simm17x16
br_ge_i32
5 ra rb 3 simm17x16
br_ge_i64
6 ra rb 3 simm17x16
br_ge_imm_i128
121 ra simm11 simm17x16
br_ge_imm_i32
109 ra simm11 simm17x16
br_ge_imm_i64
115 ra simm11 simm17x16
br_ge_imm_u128
123 ra uimm11 simm17x16
br_ge_imm_u32
111 ra uimm11 simm17x16
br_ge_imm_u64
117 ra uimm11 simm17x16
br_ge_u128
7 ra rb 5 simm17x16
br_ge_u32
5 ra rb 5 simm17x16
br_ge_u64
6 ra rb 5 simm17x16
br_lt_i128
7 ra rb 2 simm17x16
br_lt_i32
5 ra rb 2 simm17x16
br_lt_i64
6 ra rb 2 simm17x16
br_lt_imm_i128
120 ra simm11 simm17x16
br_lt_imm_i32
108 ra simm11 simm17x16
br_lt_imm_i64
114 ra simm11 simm17x16
br_lt_imm_u128
122 ra uimm11 simm17x16
br_lt_imm_u32
110 ra uimm11 simm17x16
br_lt_imm_u64
116 ra uimm11 simm17x16
br_lt_u128
7 ra rb 4 simm17x16
br_lt_u32
5 ra rb 4 simm17x16
br_lt_u64
6 ra rb 4 simm17x16
br_mask_all
124 ra uimm11 simm17x16
br_mask_any
127 ra uimm11 simm17x16
br_mask_none
126 ra uimm11 simm17x16
br_mask_notall
125 ra uimm11 simm17x16
br_ne_i128
7 ra rb 1 simm17x16
br_ne_i32
5 ra rb 1 simm17x16
br_ne_i64
6 ra rb 1 simm17x16
br_ne_imm_i128
119 ra simm11 simm17x16
br_ne_imm_i32
107 ra simm11 simm17x16
br_ne_imm_i64
113 ra simm11 simm17x16
br_o_f128
7 ra rb 10 simm17x16
br_o_f32
5 ra rb 10 simm17x16
br_o_f64
6 ra rb 10 simm17x16
br_oeq_f128
7 ra rb 6 simm17x16
br_oeq_f32
5 ra rb 6 simm17x16
br_oeq_f64
6 ra rb 6 simm17x16
br_oge_f128
7 ra rb 9 simm17x16
br_oge_f32
5 ra rb 9 simm17x16
br_oge_f64
6 ra rb 9 simm17x16
br_olt_f128
7 ra rb 8 simm17x16
br_olt_f32
5 ra rb 8 simm17x16
br_olt_f64
6 ra rb 8 simm17x16
br_one_f128
7 ra rb 7 simm17x16
br_one_f32
5 ra rb 7 simm17x16
br_one_f64
6 ra rb 7 simm17x16
br_u_f128
7 ra rb 15 simm17x16
br_u_f32
5 ra rb 15 simm17x16
br_u_f64
6 ra rb 15 simm17x16
br_ueq_f128
7 ra rb 11 simm17x16
br_ueq_f32
5 ra rb 11 simm17x16
br_ueq_f64
6 ra rb 11 simm17x16
br_uge_f128
7 ra rb 14 simm17x16
br_uge_f32
5 ra rb 14 simm17x16
br_uge_f64
6 ra rb 14 simm17x16
br_ult_f128
7 ra rb 13 simm17x16
br_ult_f32
5 ra rb 13 simm17x16
br_ult_f64
6 ra rb 13 simm17x16
br_une_f128
7 ra rb 12 simm17x16
br_une_f32
5 ra rb 12 simm17x16
br_une_f64
6 ra rb 12 simm17x16
call
104 ra simm28x16
call_mi
1 ra rb simm14 100
call_plt
103 ra uimm28
call_ri
0 ra rb rc 16 0
call_rvt
1 ra rb simm14 101
class_f128
0 ra rb uimm10 1696 imm
class_f16
0 ra rb uimm10 1216 imm
class_f32
0 ra rb uimm10 1376 imm
class_f64
0 ra rb uimm10 1536 imm
clmul
0 ra rb rc 544 scale
cmov_eq_i128
1 ra rb rc rd 69
cmov_eq_i32
1 ra rb rc rd 61
cmov_eq_i64
1 ra rb rc rd 65
cmov_le_i128
1 ra rb rc rd 71
cmov_le_i32
1 ra rb rc rd 63
cmov_le_i64
1 ra rb rc rd 67
cmov_lsb
1 ra rb rc rd 60
cmov_lt_i128
1 ra rb rc rd 70
cmov_lt_i32
1 ra rb rc rd 62
cmov_lt_i64
1 ra rb rc rd 66
cmp_eq_i128
0 ra rb rc 300 0
cmp_eq_i32
0 ra rb rc 140 0
cmp_eq_i64
0 ra rb rc 220 0
cmp_eq_imm_i128
64 ra rb simm21
cmp_eq_imm_i32
58 ra rb simm21
cmp_eq_imm_i64
52 ra rb simm21
cmp_eq_vi16
0 ra rb rc 1872 0
cmp_eq_vi32
0 ra rb rc 1936 0
cmp_eq_vi64
0 ra rb rc 2000 0
cmp_eq_vi8
0 ra rb rc 1808 0
cmp_ge_i128
0 ra rb rc 303 0
cmp_ge_i32
0 ra rb rc 143 0
cmp_ge_i64
0 ra rb rc 223 0
cmp_ge_imm_i128
67 ra rb simm21
cmp_ge_imm_i32
61 ra rb simm21
cmp_ge_imm_i64
55 ra rb simm21
cmp_ge_imm_u128
69 ra rb uimm21
cmp_ge_imm_u32
63 ra rb uimm21
cmp_ge_imm_u64
57 ra rb uimm21
cmp_ge_u128
0 ra rb rc 305 0
cmp_ge_u32
0 ra rb rc 145 0
cmp_ge_u64
0 ra rb rc 225 0
cmp_lt_i128
0 ra rb rc 302 0
cmp_lt_i32
0 ra rb rc 142 0
cmp_lt_i64
0 ra rb rc 222 0
cmp_lt_imm_i128
66 ra rb simm21
cmp_lt_imm_i32
60 ra rb simm21
cmp_lt_imm_i64
54 ra rb simm21
cmp_lt_imm_u128
68 ra rb uimm21
cmp_lt_imm_u32
62 ra rb uimm21
cmp_lt_imm_u64
56 ra rb uimm21
cmp_lt_u128
0 ra rb rc 304 0
cmp_lt_u32
0 ra rb rc 144 0
cmp_lt_u64
0 ra rb rc 224 0
cmp_lt_vi16
0 ra rb rc 1873 0
cmp_lt_vi32
0 ra rb rc 1937 0
cmp_lt_vi64
0 ra rb rc 2001 0
cmp_lt_vi8
0 ra rb rc 1809 0
cmp_lt_vu16
0 ra rb rc 1874 0
cmp_lt_vu32
0 ra rb rc 1938 0
cmp_lt_vu64
0 ra rb rc 2002 0
cmp_lt_vu8
0 ra rb rc 1810 0
cmp_ne_i128
0 ra rb rc 301 0
cmp_ne_i32
0 ra rb rc 141 0
cmp_ne_i64
0 ra rb rc 221 0
cmp_ne_imm_i128
65 ra rb simm21
cmp_ne_imm_i32
59 ra rb simm21
cmp_ne_imm_i64
53 ra rb simm21
cmp_o_f128
0 ra rb rc 1724 0
cmp_o_f16
0 ra rb rc 1244 0
cmp_o_f32
0 ra rb rc 1404 0
cmp_o_f64
0 ra rb rc 1564 0
cmp_o_vf16
0 ra rb rc 1328 0
cmp_o_vf32
0 ra rb rc 1488 0
cmp_o_vf64
0 ra rb rc 1648 0
cmp_oeq_f128
0 ra rb rc 1720 0
cmp_oeq_f16
0 ra rb rc 1240 0
cmp_oeq_f32
0 ra rb rc 1400 0
cmp_oeq_f64
0 ra rb rc 1560 0
cmp_oeq_vf16
0 ra rb rc 1324 0
cmp_oeq_vf32
0 ra rb rc 1484 0
cmp_oeq_vf64
0 ra rb rc 1644 0
cmp_oge_f128
0 ra rb rc 1723 0
cmp_oge_f16
0 ra rb rc 1243 0
cmp_oge_f32
0 ra rb rc 1403 0
cmp_oge_f64
0 ra rb rc 1563 0
cmp_oge_vf16
0 ra rb rc 1327 0
cmp_oge_vf32
0 ra rb rc 1487 0
cmp_oge_vf64
0 ra rb rc 1647 0
cmp_olt_f128
0 ra rb rc 1722 0
cmp_olt_f16
0 ra rb rc 1242 0
cmp_olt_f32
0 ra rb rc 1402 0
cmp_olt_f64
0 ra rb rc 1562 0
cmp_olt_vf16
0 ra rb rc 1326 0
cmp_olt_vf32
0 ra rb rc 1486 0
cmp_olt_vf64
0 ra rb rc 1646 0
cmp_one_f128
0 ra rb rc 1721 0
cmp_one_f16
0 ra rb rc 1241 0
cmp_one_f32
0 ra rb rc 1401 0
cmp_one_f64
0 ra rb rc 1561 0
cmp_one_vf16
0 ra rb rc 1325 0
cmp_one_vf32
0 ra rb rc 1485 0
cmp_one_vf64
0 ra rb rc 1645 0
cmp_u_f128
0 ra rb rc 1729 0
cmp_u_f16
0 ra rb rc 1249 0
cmp_u_f32
0 ra rb rc 1409 0
cmp_u_f64
0 ra rb rc 1569 0
cmp_u_vf16
0 ra rb rc 1333 0
cmp_u_vf32
0 ra rb rc 1493 0
cmp_u_vf64
0 ra rb rc 1653 0
cmp_ueq_f128
0 ra rb rc 1725 0
cmp_ueq_f16
0 ra rb rc 1245 0
cmp_ueq_f32
0 ra rb rc 1405 0
cmp_ueq_f64
0 ra rb rc 1565 0
cmp_ueq_vf16
0 ra rb rc 1329 0
cmp_ueq_vf32
0 ra rb rc 1489 0
cmp_ueq_vf64
0 ra rb rc 1649 0
cmp_uge_f128
0 ra rb rc 1728 0
cmp_uge_f16
0 ra rb rc 1248 0
cmp_uge_f32
0 ra rb rc 1408 0
cmp_uge_f64
0 ra rb rc 1568 0
cmp_uge_vf16
0 ra rb rc 1332 0
cmp_uge_vf32
0 ra rb rc 1492 0
cmp_uge_vf64
0 ra rb rc 1652 0
cmp_ult_f128
0 ra rb rc 1727 0
cmp_ult_f16
0 ra rb rc 1247 0
cmp_ult_f32
0 ra rb rc 1407 0
cmp_ult_f64
0 ra rb rc 1567 0
cmp_ult_vf16
0 ra rb rc 1331 0
cmp_ult_vf32
0 ra rb rc 1491 0
cmp_ult_vf64
0 ra rb rc 1651 0
cmp_une_f128
0 ra rb rc 1726 0
cmp_une_f16
0 ra rb rc 1246 0
cmp_une_f32
0 ra rb rc 1406 0
cmp_une_f64
0 ra rb rc 1566 0
cmp_une_vf16
0 ra rb rc 1330 0
cmp_une_vf32
0 ra rb rc 1490 0
cmp_une_vf64
0 ra rb rc 1650 0
cnt_lz
0 ra rb sc 25 0
cnt_pop
0 ra rb sc 24 0
cnt_tz
0 ra rb sc 26 0
cpuid
0 ra rb simm10 513 imm
crc32c
1 ra rb rc rd 32
cvt_f128_f16
0 ra rb 0 1757 rm
cvt_f128_f32
0 ra rb 0 1756 rm
cvt_f128_f64
0 ra rb 0 1755 rm
cvt_f128_i128
0 ra rb 0 1744 rm
cvt_f128_i32
0 ra rb 0 1736 rm
cvt_f128_i64
0 ra rb 0 1740 rm
cvt_f128_u128
0 ra rb 0 1745 rm
cvt_f128_u32
0 ra rb 0 1737 rm
cvt_f128_u64
0 ra rb 0 1741 rm
cvt_f16_i128
0 ra rb 0 1264 rm
cvt_f16_i32
0 ra rb 0 1256 rm
cvt_f16_i64
0 ra rb 0 1260 rm
cvt_f16_u128
0 ra rb 0 1265 rm
cvt_f16_u32
0 ra rb 0 1257 rm
cvt_f16_u64
0 ra rb 0 1261 rm
cvt_f32_f16
0 ra rb 0 1274 rm
cvt_f32_i128
0 ra rb 0 1424 rm
cvt_f32_i32
0 ra rb 0 1416 rm
cvt_f32_i64
0 ra rb 0 1420 rm
cvt_f32_u128
0 ra rb 0 1425 rm
cvt_f32_u32
0 ra rb 0 1417 rm
cvt_f32_u64
0 ra rb 0 1421 rm
cvt_f64_f16
0 ra rb 0 1275 rm
cvt_f64_f32
0 ra rb 0 1433 rm
cvt_f64_i128
0 ra rb 0 1584 rm
cvt_f64_i32
0 ra rb 0 1576 rm
cvt_f64_i64
0 ra rb 0 1580 rm
cvt_f64_u128
0 ra rb 0 1585 rm
cvt_f64_u32
0 ra rb 0 1577 rm
cvt_f64_u64
0 ra rb 0 1581 rm
cvt_i128_f128
0 ra rb 0 1746 rm
cvt_i128_f16
0 ra rb 0 1266 rm
cvt_i128_f32
0 ra rb 0 1426 rm
cvt_i128_f64
0 ra rb 0 1586 rm
cvt_i32_f128
0 ra rb 0 1738 rm
cvt_i32_f16
0 ra rb 0 1258 rm
cvt_i32_f32
0 ra rb 0 1418 rm
cvt_i32_f64
0 ra rb 0 1578 rm
cvt_i64_f128
0 ra rb 0 1742 rm
cvt_i64_f16
0 ra rb 0 1262 rm
cvt_i64_f32
0 ra rb 0 1422 rm
cvt_i64_f64
0 ra rb 0 1582 rm
cvt_u128_f128
0 ra rb 0 1747 rm
cvt_u128_f16
0 ra rb 0 1267 rm
cvt_u128_f32
0 ra rb 0 1427 rm
cvt_u128_f64
0 ra rb 0 1587 rm
cvt_u32_f128
0 ra rb 0 1739 rm
cvt_u32_f16
0 ra rb 0 1259 rm
cvt_u32_f32
0 ra rb 0 1419 rm
cvt_u32_f64
0 ra rb 0 1579 rm
cvt_u64_f128
0 ra rb 0 1743 rm
cvt_u64_f16
0 ra rb 0 1263 rm
cvt_u64_f32
0 ra rb 0 1423 rm
cvt_u64_f64
0 ra rb 0 1583 rm
cvt_vf16_vi16
0 ra rb 0 1347 rm
cvt_vf16_vu16
0 ra rb 0 1348 rm
cvt_vf32_vi32
0 ra rb 0 1507 rm
cvt_vf32_vu32
0 ra rb 0 1508 rm
cvt_vf64_vi64
0 ra rb 0 1667 rm
cvt_vf64_vu64
0 ra rb 0 1668 rm
cvt_vi16_vf16
0 ra rb 0 1349 rm
cvt_vi32_vf32
0 ra rb 0 1509 rm
cvt_vi64_vf64
0 ra rb 0 1669 rm
cvt_vu16_vf16
0 ra rb 0 1350 rm
cvt_vu32_vf32
0 ra rb 0 1510 rm
cvt_vu64_vf64
0 ra rb 0 1670 rm
dcbf
2 17 rb simm21
dcbi
2 18 rb simm21
dcbt
2 16 rb simm21
deposit
23 ra rb rc sd se
deposit_r
1 ra rb rc rd 43
div_f128
0 ra rb rc 1704 rm
div_f16
0 ra rb rc 1224 rm
div_f32
0 ra rb rc 1384 rm
div_f64
0 ra rb rc 1544 rm
div_i128
0 ra rb rc 308 0
div_i32
0 ra rb rc 148 0
div_i64
0 ra rb rc 228 0
div_imm_i32
40 ra rb simm21
div_imm_i64
36 ra rb simm21
div_imm_u32
41 ra rb uimm21
div_imm_u64
37 ra rb uimm21
div_u128
0 ra rb rc 309 0
div_u32
0 ra rb rc 149 0
div_u64
0 ra rb rc 229 0
div_vf16
0 ra rb rc 1308 rm
div_vf32
0 ra rb rc 1468 rm
div_vf64
0 ra rb rc 1628 rm
divp2_i128
0 ra rb rc 299 0
divp2_i32
0 ra rb rc 139 0
divp2_i64
0 ra rb rc 219 0
divp2_imm_i128
0 ra rb sc 295 0
divp2_imm_i32
0 ra rb sc 135 0
divp2_imm_i64
0 ra rb sc 215 0
dot_vf16
0 ra rb rc 1341 rm
dot_vf32
0 ra rb rc 1501 rm
dot_vf64
0 ra rb rc 1661 rm
eh_adj
2 8 simm28x16
eh_catch
2 10 rb 0 simm17x16
eh_next
2 11 rb 0 simm17x16
eh_throw
2 9 rb simm21
extend_f16_f128
0 ra rb 0 1754 0
extend_f16_f32
0 ra rb 0 1272 0
extend_f16_f64
0 ra rb 0 1273 0
extend_f32_f128
0 ra rb 0 1752 0
extend_f32_f64
0 ra rb 0 1432 0
extend_f64_f128
0 ra rb 0 1753 0
fence
0 0 564 mo
get_dbr
0 ra rb simm10 520 imm
get_ibr
0 ra rb simm10 522 imm
get_mr
0 ra rb simm10 524 imm
get_spr
0 ra 0 spr 517 0
gtb
0 ra rb 0 22 0
halt
0 0 528 0
icbi
2 19 rb simm21
int
0 0 rb simm10 514 imm
jmp
2 1 simm28x16
jmp_mi
0 0 rb rc 563 scale
jmp_r
0 0 rb rc 12 scale
jmp_t
0 0 rb rc 13 0
jmp_t_i32
0 0 rb rc 14 0
jmp_t_u32
0 0 rb rc 15 0
ld_i128
84 ra rb simm21
ld_i16
76 ra rb simm21
ld_i32
79 ra rb simm21
ld_i64
82 ra rb simm21
ld_i8
73 ra rb simm21
ld_imm
16 ra simm28
ld_imm_f32
16 ra simm28
ld_imm_f64
16 ra simm28
ld_imm_high
17 ra simm28
ld_iprel_f128
100 ra uimm28
ld_iprel_f32
94 ra uimm28
ld_iprel_f64
97 ra uimm28
ld_iprel_i128
100 ra uimm28
ld_iprel_i16
92 ra uimm28
ld_iprel_i32
95 ra uimm28
ld_iprel_i64
98 ra uimm28
ld_iprel_i8
89 ra uimm28
ld_iprel_u16
91 ra uimm28
ld_iprel_u32
94 ra uimm28
ld_iprel_u64
97 ra uimm28
ld_iprel_u8
88 ra uimm28
ld_mia_i128
0 ra rb simm10 108 imm
ld_mia_i16
0 ra rb simm10 100 imm
ld_mia_i32
0 ra rb simm10 103 imm
ld_mia_i64
0 ra rb simm10 106 imm
ld_mia_i8
0 ra rb simm10 97 imm
ld_mia_u16
0 ra rb simm10 99 imm
ld_mia_u32
0 ra rb simm10 102 imm
ld_mia_u64
0 ra rb simm10 105 imm
ld_mia_u8
0 ra rb simm10 96 imm
ld_mib_i128
0 ra rb simm10 124 imm
ld_mib_i16
0 ra rb simm10 116 imm
ld_mib_i32
0 ra rb simm10 119 imm
ld_mib_i64
0 ra rb simm10 122 imm
ld_mib_i8
0 ra rb simm10 113 imm
ld_mib_u16
0 ra rb simm10 115 imm
ld_mib_u32
0 ra rb simm10 118 imm
ld_mib_u64
0 ra rb simm10 121 imm
ld_mib_u8
0 ra rb simm10 112 imm
ld_u16
75 ra rb simm21
ld_u32
78 ra rb simm21
ld_u64
81 ra rb simm21
ld_u8
72 ra rb simm21
ld_xi32_i128
9 ra rb rc simm7 12 scale
ld_xi32_i16
9 ra rb rc simm7 4 scale
ld_xi32_i32
9 ra rb rc simm7 7 scale
ld_xi32_i64
9 ra rb rc simm7 10 scale
ld_xi32_i8
9 ra rb rc simm7 1 scale
ld_xi32_u16
9 ra rb rc simm7 3 scale
ld_xi32_u32
9 ra rb rc simm7 6 scale
ld_xi32_u64
9 ra rb rc simm7 9 scale
ld_xi32_u8
9 ra rb rc simm7 0 scale
ld_xi64_i128
8 ra rb rc simm7 12 scale
ld_xi64_i16
8 ra rb rc simm7 4 scale
ld_xi64_i32
8 ra rb rc simm7 7 scale
ld_xi64_i64
8 ra rb rc simm7 10 scale
ld_xi64_i8
8 ra rb rc simm7 1 scale
ld_xi64_u16
8 ra rb rc simm7 3 scale
ld_xi64_u32
8 ra rb rc simm7 6 scale
ld_xi64_u64
8 ra rb rc simm7 9 scale
ld_xi64_u8
8 ra rb rc simm7 0 scale
ld_xu32_i128
10 ra rb rc simm7 12 scale
ld_xu32_i16
10 ra rb rc simm7 4 scale
ld_xu32_i32
10 ra rb rc simm7 7 scale
ld_xu32_i64
10 ra rb rc simm7 10 scale
ld_xu32_i8
10 ra rb rc simm7 1 scale
ld_xu32_u16
10 ra rb rc simm7 3 scale
ld_xu32_u32
10 ra rb rc simm7 6 scale
ld_xu32_u64
10 ra rb rc simm7 9 scale
ld_xu32_u8
10 ra rb rc simm7 0 scale
ld_xu64_i128
11 ra rb rc simm7 12 scale
ld_xu64_i16
11 ra rb rc simm7 4 scale
ld_xu64_i32
11 ra rb rc simm7 7 scale
ld_xu64_i64
11 ra rb rc simm7 10 scale
ld_xu64_i8
11 ra rb rc simm7 1 scale
ld_xu64_u16
11 ra rb rc simm7 3 scale
ld_xu64_u32
11 ra rb rc simm7 6 scale
ld_xu64_u64
11 ra rb rc simm7 9 scale
ld_xu64_u8
11 ra rb rc simm7 0 scale
lda_iprel
102 ra uimm28
lda_n
1 ra rb simm14 102
lda_nrc
1 ra rb simm14 103
lda_r
105 ra simm28x16
lda_xi32
9 ra rb rc simm7 14 scale
lda_xi64
8 ra rb rc simm7 14 scale
lda_xu32
10 ra rb rc simm7 14 scale
lda_xu64
11 ra rb rc simm7 14 scale
madd_alt_vf16
12 ra rb rc rd 8 rm
madd_alt_vf32
13 ra rb rc rd 8 rm
madd_alt_vf64
14 ra rb rc rd 8 rm
madd_f128
15 ra rb rc rd 0 rm
madd_f16
12 ra rb rc rd 0 rm
madd_f32
13 ra rb rc rd 0 rm
madd_f64
14 ra rb rc rd 0 rm
madd_vf16
12 ra rb rc rd 4 rm
madd_vf32
13 ra rb rc rd 4 rm
madd_vf64
14 ra rb rc rd 4 rm
max_f128
0 ra rb rc 1711 0
max_f16
0 ra rb rc 1231 0
max_f32
0 ra rb rc 1391 0
max_f64
0 ra rb rc 1551 0
max_i128
0 ra rb rc 312 0
max_i32
0 ra rb rc 152 0
max_i64
0 ra rb rc 232 0
max_imm_i32
48 ra rb simm21
max_imm_i64
44 ra rb simm21
max_imm_u32
49 ra rb uimm21
max_imm_u64
45 ra rb uimm21
max_u128
0 ra rb rc 313 0
max_u32
0 ra rb rc 153 0
max_u64
0 ra rb rc 233 0
max_vf16
0 ra rb rc 1310 0
max_vf32
0 ra rb rc 1470 0
max_vf64
0 ra rb rc 1630 0
max_vi16
0 ra rb rc 1856 0
max_vi32
0 ra rb rc 1920 0
max_vi64
0 ra rb rc 1984 0
max_vi8
0 ra rb rc 1792 0
max_vu16
0 ra rb rc 1857 0
max_vu32
0 ra rb rc 1921 0
max_vu64
0 ra rb rc 1985 0
max_vu8
0 ra rb rc 1793 0
maxnum_f128
0 ra rb rc 1713 0
maxnum_f16
0 ra rb rc 1233 0
maxnum_f32
0 ra rb rc 1393 0
maxnum_f64
0 ra rb rc 1553 0
maxnum_vf16
0 ra rb rc 1312 0
maxnum_vf32
0 ra rb rc 1472 0
maxnum_vf64
0 ra rb rc 1632 0
mbgath
0 ra rb rc 20 0
mbscat
0 ra rb rc 21 0
mbsel
1 ra rb rc rd 28
merge_f128
1 ra rb rc rd 78
merge_f16
1 ra rb rc rd 75
merge_f32
1 ra rb rc rd 76
merge_f64
1 ra rb rc rd 77
merge_high_vf16
0 ra rb rc 1343 rm
merge_high_vf32
0 ra rb rc 1503 rm
merge_high_vf64
0 ra rb rc 1663 rm
merge_high_vu16
0 ra rb rc 1883 0
merge_high_vu32
0 ra rb rc 1947 0
merge_high_vu64
0 ra rb rc 2011 0
merge_high_vu8
0 ra rb rc 1819 0
merge_low_vf16
0 ra rb rc 1342 rm
merge_low_vf32
0 ra rb rc 1502 rm
merge_low_vf64
0 ra rb rc 1662 rm
merge_low_vu16
0 ra rb rc 1884 0
merge_low_vu32
0 ra rb rc 1948 0
merge_low_vu64
0 ra rb rc 2012 0
merge_low_vu8
0 ra rb rc 1820 0
merge_vf16
1 ra rb rc rd 79
merge_vf32
1 ra rb rc rd 80
merge_vf64
1 ra rb rc rd 81
min_f128
0 ra rb rc 1710 0
min_f16
0 ra rb rc 1230 0
min_f32
0 ra rb rc 1390 0
min_f64
0 ra rb rc 1550 0
min_i128
0 ra rb rc 314 0
min_i32
0 ra rb rc 154 0
min_i64
0 ra rb rc 234 0
min_imm_i32
50 ra rb simm21
min_imm_i64
46 ra rb simm21
min_imm_u32
51 ra rb uimm21
min_imm_u64
47 ra rb uimm21
min_u128
0 ra rb rc 315 0
min_u32
0 ra rb rc 155 0
min_u64
0 ra rb rc 235 0
min_vf16
0 ra rb rc 1309 0
min_vf32
0 ra rb rc 1469 0
min_vf64
0 ra rb rc 1629 0
min_vi16
0 ra rb rc 1858 0
min_vi32
0 ra rb rc 1922 0
min_vi64
0 ra rb rc 1986 0
min_vi8
0 ra rb rc 1794 0
min_vu16
0 ra rb rc 1859 0
min_vu32
0 ra rb rc 1923 0
min_vu64
0 ra rb rc 1987 0
min_vu8
0 ra rb rc 1795 0
minnum_f128
0 ra rb rc 1712 0
minnum_f16
0 ra rb rc 1232 0
minnum_f32
0 ra rb rc 1392 0
minnum_f64
0 ra rb rc 1552 0
minnum_vf16
0 ra rb rc 1311 0
minnum_vf32
0 ra rb rc 1471 0
minnum_vf64
0 ra rb rc 1631 0
mov
0 ra rb 0 1 0
mov2
1 ra rb rc rd 30
mprobe
0 ra rb rc 512 0
msub_alt_vf16
12 ra rb rc rd 9 rm
msub_alt_vf32
13 ra rb rc rd 9 rm
msub_alt_vf64
14 ra rb rc rd 9 rm
msub_f128
15 ra rb rc rd 1 rm
msub_f16
12 ra rb rc rd 1 rm
msub_f32
13 ra rb rc rd 1 rm
msub_f64
14 ra rb rc rd 1 rm
msub_vf16
12 ra rb rc rd 5 rm
msub_vf32
13 ra rb rc rd 5 rm
msub_vf64
14 ra rb rc rd 5 rm
mul_add
1 ra rb rc rd 72
mul_f128
0 ra rb rc 1702 rm
mul_f16
0 ra rb rc 1222 rm
mul_f32
0 ra rb rc 1382 rm
mul_f64
0 ra rb rc 1542 rm
mul_h
0 ra rb rc 247 0
mul_horiz_vf16
0 ra rb rc 1340 rm
mul_horiz_vf32
0 ra rb rc 1500 rm
mul_horiz_vf64
0 ra rb rc 1660 rm
mul_i128
0 ra rb rc 290 0
mul_i32
0 ra rb rc 130 0
mul_i64
0 ra rb rc 210 0
mul_imm_i32
28 ra rb simm21
mul_imm_i64
32 ra rb simm21
mul_imm_u32
29 ra rb uimm21
mul_sub
1 ra rb rc rd 73
mul_subr
1 ra rb rc rd 74
mul_u32
0 ra rb rc 160 0
mul_vf16
0 ra rb rc 1306 rm
mul_vf32
0 ra rb rc 1466 rm
mul_vf64
0 ra rb rc 1626 rm
nabs_diff_f128
0 ra rb rc 1709 rm
nabs_diff_f16
0 ra rb rc 1229 rm
nabs_diff_f32
0 ra rb rc 1389 rm
nabs_diff_f64
0 ra rb rc 1549 rm
nabs_diff_vf16
0 ra rb rc 1300 rm
nabs_diff_vf32
0 ra rb rc 1460 rm
nabs_diff_vf64
0 ra rb rc 1620 rm
nabs_f128
0 ra rb 0 1707 0
nabs_f16
0 ra rb 0 1227 0
nabs_f32
0 ra rb 0 1387 0
nabs_f64
0 ra rb 0 1547 0
nabs_vf16
0 ra rb 0 1298 0
nabs_vf32
0 ra rb 0 1458 0
nabs_vf64
0 ra rb 0 1618 0
nadd_f128
0 ra rb rc 1701 rm
nadd_f16
0 ra rb rc 1221 rm
nadd_f32
0 ra rb rc 1381 rm
nadd_f64
0 ra rb rc 1541 rm
nadd_vf16
0 ra rb rc 1305 rm
nadd_vf32
0 ra rb rc 1465 rm
nadd_vf64
0 ra rb rc 1625 rm
nand
0 ra rb rc 8 0
neg_f128
0 ra rb 0 1705 0
neg_f16
0 ra rb 0 1225 0
neg_f32
0 ra rb 0 1385 0
neg_f64
0 ra rb 0 1545 0
neg_i128
0 ra rb 0 291 0
neg_i32
0 ra rb 0 131 0
neg_i64
0 ra rb 0 211 0
neg_vf16
0 ra rb 0 1296 0
neg_vf32
0 ra rb 0 1456 0
neg_vf64
0 ra rb 0 1616 0
nmadd_f128
15 ra rb rc rd 2 rm
nmadd_f16
12 ra rb rc rd 2 rm
nmadd_f32
13 ra rb rc rd 2 rm
nmadd_f64
14 ra rb rc rd 2 rm
nmadd_vf16
12 ra rb rc rd 6 rm
nmadd_vf32
13 ra rb rc rd 6 rm
nmadd_vf64
14 ra rb rc rd 6 rm
nmsub_f128
15 ra rb rc rd 3 rm
nmsub_f16
12 ra rb rc rd 3 rm
nmsub_f32
13 ra rb rc rd 3 rm
nmsub_f64
14 ra rb rc rd 3 rm
nmsub_vf16
12 ra rb rc rd 7 rm
nmsub_vf32
13 ra rb rc rd 7 rm
nmsub_vf64
14 ra rb rc rd 7 rm
nmul_f128
0 ra rb rc 1703 rm
nmul_f16
0 ra rb rc 1223 rm
nmul_f32
0 ra rb rc 1383 rm
nmul_f64
0 ra rb rc 1543 rm
nmul_vf16
0 ra rb rc 1307 rm
nmul_vf32
0 ra rb rc 1467 rm
nmul_vf64
0 ra rb rc 1627 rm
nop
2 0 simm28
nor
0 ra rb rc 9 0
not
0 ra rb 0 3 0
nul_bc
1 ra rb 0 dn dy 0
nul_bc_imm
1 ra sb 1 dn dy 0
nul_bs
1 ra rb 2 dn dy 0
nul_bs_imm
1 ra sb 3 dn dy 0
nul_eq_i128
1 ra rb 0 dn dy 3
nul_eq_i32
1 ra rb 0 dn dy 1
nul_eq_i64
1 ra rb 0 dn dy 2
nul_eq_imm_i128
1 ra simm11 dn dy 118
nul_eq_imm_i32
1 ra simm11 dn dy 106
nul_eq_imm_i64
1 ra simm11 dn dy 112
nul_ge_i128
1 ra rb 3 dn dy 3
nul_ge_i32
1 ra rb 3 dn dy 1
nul_ge_i64
1 ra rb 3 dn dy 2
nul_ge_imm_i128
1 ra simm11 dn dy 121
nul_ge_imm_i32
1 ra simm11 dn dy 109
nul_ge_imm_i64
1 ra simm11 dn dy 115
nul_ge_imm_u128
1 ra uimm11 dn dy 123
nul_ge_imm_u32
1 ra uimm11 dn dy 111
nul_ge_imm_u64
1 ra uimm11 dn dy 117
nul_ge_u128
1 ra rb 5 dn dy 3
nul_ge_u32
1 ra rb 5 dn dy 1
nul_ge_u64
1 ra rb 5 dn dy 2
nul_lt_i128
1 ra rb 2 dn dy 3
nul_lt_i32
1 ra rb 2 dn dy 1
nul_lt_i64
1 ra rb 2 dn dy 2
nul_lt_imm_i128
1 ra simm11 dn dy 120
nul_lt_imm_i32
1 ra simm11 dn dy 108
nul_lt_imm_i64
1 ra simm11 dn dy 114
nul_lt_imm_u128
1 ra uimm11 dn dy 122
nul_lt_imm_u32
1 ra uimm11 dn dy 110
nul_lt_imm_u64
1 ra uimm11 dn dy 116
nul_lt_u128
1 ra rb 4 dn dy 3
nul_lt_u32
1 ra rb 4 dn dy 1
nul_lt_u64
1 ra rb 4 dn dy 2
nul_mask_all
1 ra uimm11 dn dy 124
nul_mask_any
1 ra uimm11 dn dy 127
nul_mask_none
1 ra uimm11 dn dy 126
nul_mask_notall
1 ra uimm11 dn dy 125
nul_ne_i128
1 ra rb 1 dn dy 3
nul_ne_i32
1 ra rb 1 dn dy 1
nul_ne_i64
1 ra rb 1 dn dy 2
nul_ne_imm_i128
1 ra simm11 dn dy 119
nul_ne_imm_i32
1 ra simm11 dn dy 107
nul_ne_imm_i64
1 ra simm11 dn dy 113
nul_o_f128
1 ra rb 10 dn dy 3
nul_o_f32
1 ra rb 10 dn dy 1
nul_o_f64
1 ra rb 10 dn dy 2
nul_oeq_f128
1 ra rb 6 dn dy 3
nul_oeq_f32
1 ra rb 6 dn dy 1
nul_oeq_f64
1 ra rb 6 dn dy 2
nul_oge_f128
1 ra rb 9 dn dy 3
nul_oge_f32
1 ra rb 9 dn dy 1
nul_oge_f64
1 ra rb 9 dn dy 2
nul_olt_f128
1 ra rb 8 dn dy 3
nul_olt_f32
1 ra rb 8 dn dy 1
nul_olt_f64
1 ra rb 8 dn dy 2
nul_one_f128
1 ra rb 7 dn dy 3
nul_one_f32
1 ra rb 7 dn dy 1
nul_one_f64
1 ra rb 7 dn dy 2
nul_u_f128
1 ra rb 15 dn dy 3
nul_u_f32
1 ra rb 15 dn dy 1
nul_u_f64
1 ra rb 15 dn dy 2
nul_ueq_f128
1 ra rb 11 dn dy 3
nul_ueq_f32
1 ra rb 11 dn dy 1
nul_ueq_f64
1 ra rb 11 dn dy 2
nul_uge_f128
1 ra rb 14 dn dy 3
nul_uge_f32
1 ra rb 14 dn dy 1
nul_uge_f64
1 ra rb 14 dn dy 2
nul_ult_f128
1 ra rb 13 dn dy 3
nul_ult_f32
1 ra rb 13 dn dy 1
nul_ult_f64
1 ra rb 13 dn dy 2
nul_une_f128
1 ra rb 12 dn dy 3
nul_une_f32
1 ra rb 12 dn dy 1
nul_une_f64
1 ra rb 12 dn dy 2
or
0 ra rb rc 5 0
or_imm
21 ra rb simm21
orn
0 ra rb rc 11 0
orn_imm
19 ra rb simm21
pack_mod_vu16
0 ra rb rc 1894 0
pack_mod_vu32
0 ra rb rc 1958 0
pack_mod_vu64
0 ra rb rc 2022 0
pack_sat_vi16
0 ra rb rc 1892 0
pack_sat_vi32
0 ra rb rc 1956 0
pack_sat_vi64
0 ra rb rc 2020 0
pack_sat_vu16
0 ra rb rc 1893 0
pack_sat_vu32
0 ra rb rc 1957 0
pack_sat_vu64
0 ra rb rc 2021 0
pack_usat_vi16
0 ra rb rc 1895 0
pack_usat_vi32
0 ra rb rc 1959 0
pack_usat_vi64
0 ra rb rc 2023 0
pack_vf16
0 ra rb rc 1346 0
pack_vf32
0 ra rb rc 1506 0
pack_vf64
0 ra rb rc 1666 0
perm
1 ra rb rc rd 29
permb
0 ra rb sc 27 0
ptc
0 ra rb rc 530 0
random
0 ra rb 0 516 0
rem_i128
0 ra rb rc 310 0
rem_i32
0 ra rb rc 150 0
rem_i64
0 ra rb rc 230 0
rem_imm_i32
42 ra rb simm21
rem_imm_i64
38 ra rb simm21
rem_imm_u32
43 ra rb uimm21
rem_imm_u64
39 ra rb uimm21
rem_u128
0 ra rb rc 311 0
rem_u32
0 ra rb rc 151 0
rem_u64
0 ra rb rc 231 0
rep_ge_i32
4 ra rb 11 uimm6 simm11x16
rep_ge_i64
4 ra rb 3 uimm6 simm11x16
rep_ge_u32
4 ra rb 15 uimm6 simm11x16
rep_ge_u64
4 ra rb 7 uimm6 simm11x16
rep_gt_i32
4 ra rb 9 uimm6 simm11x16
rep_gt_i64
4 ra rb 1 uimm6 simm11x16
rep_gt_u32
4 ra rb 13 uimm6 simm11x16
rep_gt_u64
4 ra rb 5 uimm6 simm11x16
rep_le_i32
4 ra rb 10 uimm6 simm11x16
rep_le_i64
4 ra rb 2 uimm6 simm11x16
rep_le_u32
4 ra rb 14 uimm6 simm11x16
rep_le_u64
4 ra rb 6 uimm6 simm11x16
rep_lt_i32
4 ra rb 8 uimm6 simm11x16
rep_lt_i64
4 ra rb 0 uimm6 simm11x16
rep_lt_u32
4 ra rb 12 uimm6 simm11x16
rep_lt_u64
4 ra rb 4 uimm6 simm11x16
ret
0 0 2 0
retf
2 2 0 uimm21
rfi
0 0 527 0
rol_vu16
0 ra rb rc 1881 0
rol_vu32
0 ra rb rc 1945 0
rol_vu64
0 ra rb rc 2009 0
rol_vu8
0 ra rb rc 1817 0
ror_vu16
0 ra rb rc 1882 0
ror_vu32
0 ra rb rc 1946 0
ror_vu64
0 ra rb rc 2010 0
ror_vu8
0 ra rb rc 1818 0
round_f128
0 ra rb 0 1716 rm
round_f16
0 ra rb 0 1236 rm
round_f32
0 ra rb 0 1396 rm
round_f64
0 ra rb 0 1556 rm
round_vf16
0 ra rb 0 1315 rm
round_vf32
0 ra rb 0 1475 rm
round_vf64
0 ra rb 0 1635 rm
roundnx_f128
0 ra rb 0 1717 rm
roundnx_f16
0 ra rb 0 1237 rm
roundnx_f32
0 ra rb 0 1397 rm
roundnx_f64
0 ra rb 0 1557 rm
roundnx_vf16
0 ra rb 0 1316 rm
roundnx_vf32
0 ra rb 0 1476 rm
roundnx_vf64
0 ra rb 0 1636 rm
rscover
0 0 536 0
rsflush
0 0 537 0
rsload
0 0 538 0
rsqrt_f128
0 ra rb 0 1698 rm
rsqrt_f16
0 ra rb 0 1218 rm
rsqrt_f32
0 ra rb 0 1378 rm
rsqrt_f64
0 ra rb 0 1538 rm
rsqrt_vf16
0 ra rb 0 1301 rm
rsqrt_vf32
0 ra rb 0 1461 rm
rsqrt_vf64
0 ra rb 0 1621 rm
scale_f128
0 ra rb sc 1758 0
set_dbr
0 ra rb simm10 519 imm
set_dtr
0 ra rb rc 526 0
set_ibr
0 ra rb simm10 521 imm
set_itr
0 ra rb rc 525 0
set_mr
0 ra rb simm10 523 imm
set_spr
0 ra 0 spr 518 0
sext_i16
0 ra rb 0 37 0
sext_i32
0 ra rb 0 38 0
sext_i64
0 ra rb 0 39 0
sext_i8
0 ra rb 0 36 0
sl_add_i32
1 ra rb rc sd 37
sl_add_i64
1 ra rb rc sd 23
sl_add_u32
1 ra rb rc sd 38
sl_or
1 ra rb rc sd 50
sl_sub_i32
1 ra rb rc sd 41
sl_sub_i64
1 ra rb rc sd 24
sl_sub_u32
1 ra rb rc sd 42
sl_subr_i32
1 ra rb rc sd 39
sl_subr_i64
1 ra rb rc sd 25
sl_subr_u32
1 ra rb rc sd 40
sl_xor
1 ra rb rc sd 51
sll_imm_u128
0 ra rb sc 292 0
sll_imm_u32
0 ra rb sc 132 0
sll_imm_u64
0 ra rb sc 212 0
sll_imm_vu16
0 ra rb sc 1876 0
sll_imm_vu32
0 ra rb sc 1940 0
sll_imm_vu64
0 ra rb sc 2004 0
sll_imm_vu8
0 ra rb sc 1812 0
sll_u128
0 ra rb rc 296 0
sll_u32
0 ra rb rc 136 0
sll_u64
0 ra rb rc 216 0
sll_vu16
0 ra rb rc 1875 0
sll_vu32
0 ra rb rc 1939 0
sll_vu64
0 ra rb rc 2003 0
sll_vu8
0 ra rb rc 1811 0
slp_i128
1 ra rb rc rd 44
slp_i32
1 ra rb rc rd 13
slp_i64
1 ra rb rc rd 16
slsra_i32
1 ra rb rc rd 27
slsra_i64
1 ra rb rc rd 20
slsra_imm_i64
1 ra rb sc sd 22
slsrl_imm_u64
1 ra rb sc sd 21
slsrl_u32
1 ra rb rc rd 26
slsrl_u64
1 ra rb rc rd 19
sqrt_f128
0 ra rb 0 1697 rm
sqrt_f16
0 ra rb 0 1217 rm
sqrt_f32
0 ra rb 0 1377 rm
sqrt_f64
0 ra rb 0 1537 rm
sqrt_vf16
0 ra rb 0 1302 rm
sqrt_vf32
0 ra rb 0 1462 rm
sqrt_vf64
0 ra rb 0 1622 rm
sra_i128
0 ra rb rc 298 0
sra_i32
0 ra rb rc 138 0
sra_i64
0 ra rb rc 218 0
sra_imm_i128
0 ra rb sc 294 0
sra_imm_i32
0 ra rb sc 134 0
sra_imm_i64
0 ra rb sc 214 0
sra_imm_vi16
0 ra rb sc 1880 0
sra_imm_vi32
0 ra rb sc 1944 0
sra_imm_vi64
0 ra rb sc 2008 0
sra_imm_vi8
0 ra rb sc 1816 0
sra_vi16
0 ra rb rc 1879 0
sra_vi32
0 ra rb rc 1943 0
sra_vi64
0 ra rb rc 2007 0
sra_vi8
0 ra rb rc 1815 0
srl_imm_u128
0 ra rb sc 293 0
srl_imm_u32
0 ra rb sc 133 0
srl_imm_u64
0 ra rb sc 213 0
srl_imm_vu16
0 ra rb sc 1878 0
srl_imm_vu32
0 ra rb sc 1942 0
srl_imm_vu64
0 ra rb sc 2006 0
srl_imm_vu8
0 ra rb sc 1814 0
srl_u128
0 ra rb rc 297 0
srl_u32
0 ra rb rc 137 0
srl_u64
0 ra rb rc 217 0
srl_vu16
0 ra rb rc 1877 0
srl_vu32
0 ra rb rc 1941 0
srl_vu64
0 ra rb rc 2005 0
srl_vu8
0 ra rb rc 1813 0
srp_i128
1 ra rb rc rd 45
srp_i32
1 ra rb rc rd 14
srp_i64
1 ra rb rc rd 17
srp_imm_i128
1 ra rb rc sd 46
srp_imm_i32
1 ra rb rc sd 15
srp_imm_i64
1 ra rb rc sd 18
st_i128
85 ra rb simm21
st_i16
77 ra rb simm21
st_i32
80 ra rb simm21
st_i64
83 ra rb simm21
st_i8
74 ra rb simm21
st_iprel_i128
101 ra uimm28
st_iprel_i16
93 ra uimm28
st_iprel_i32
96 ra uimm28
st_iprel_i64
99 ra uimm28
st_iprel_i8
90 ra uimm28
st_mia_i128
0 ra rb simm10 109 imm
st_mia_i16
0 ra rb simm10 101 imm
st_mia_i32
0 ra rb simm10 104 imm
st_mia_i64
0 ra rb simm10 107 imm
st_mia_i8
0 ra rb simm10 98 imm
st_mib_i128
0 ra rb simm10 125 imm
st_mib_i16
0 ra rb simm10 117 imm
st_mib_i32
0 ra rb simm10 120 imm
st_mib_i64
0 ra rb simm10 123 imm
st_mib_i8
0 ra rb simm10 114 imm
st_xi32_i128
9 ra rb rc simm7 13 scale
st_xi32_i16
9 ra rb rc simm7 5 scale
st_xi32_i32
9 ra rb rc simm7 8 scale
st_xi32_i64
9 ra rb rc simm7 11 scale
st_xi32_i8
9 ra rb rc simm7 2 scale
st_xi64_i128
8 ra rb rc simm7 13 scale
st_xi64_i16
8 ra rb rc simm7 5 scale
st_xi64_i32
8 ra rb rc simm7 8 scale
st_xi64_i64
8 ra rb rc simm7 11 scale
st_xi64_i8
8 ra rb rc simm7 2 scale
st_xu32_i128
10 ra rb rc simm7 13 scale
st_xu32_i16
10 ra rb rc simm7 5 scale
st_xu32_i32
10 ra rb rc simm7 8 scale
st_xu32_i64
10 ra rb rc simm7 11 scale
st_xu32_i8
10 ra rb rc simm7 2 scale
st_xu64_i128
11 ra rb rc simm7 13 scale
st_xu64_i16
11 ra rb rc simm7 5 scale
st_xu64_i32
11 ra rb rc simm7 8 scale
st_xu64_i64
11 ra rb rc simm7 11 scale
st_xu64_i8
11 ra rb rc simm7 2 scale
sub_alt_vf16
0 ra rb rc 1337 rm
sub_alt_vf32
0 ra rb rc 1497 rm
sub_alt_vf64
0 ra rb rc 1657 rm
sub_f128
0 ra rb rc 1700 rm
sub_f16
0 ra rb rc 1220 rm
sub_f32
0 ra rb rc 1380 rm
sub_f64
0 ra rb rc 1540 rm
sub_horiz_vf16
0 ra rb rc 1339 rm
sub_horiz_vf32
0 ra rb rc 1499 rm
sub_horiz_vf64
0 ra rb rc 1659 rm
sub_i128
0 ra rb rc 289 0
sub_i32
0 ra rb rc 129 0
sub_i64
0 ra rb rc 209 0
sub_sat_vi16
0 ra rb rc 1868 0
sub_sat_vi32
0 ra rb rc 1932 0
sub_sat_vi64
0 ra rb rc 1996 0
sub_sat_vi8
0 ra rb rc 1804 0
sub_sat_vu16
0 ra rb rc 1869 0
sub_sat_vu32
0 ra rb rc 1933 0
sub_sat_vu64
0 ra rb rc 1997 0
sub_sat_vu8
0 ra rb rc 1805 0
sub_sub_i64
1 ra rb rc rd 10
sub_subb_u64
1 ra rb rc rd 12
sub_u32
0 ra rb rc 162 0
sub_vf16
0 ra rb rc 1304 rm
sub_vf32
0 ra rb rc 1464 rm
sub_vf64
0 ra rb rc 1624 rm
sub_vu16
0 ra rb rc 1861 0
sub_vu32
0 ra rb rc 1925 0
sub_vu64
0 ra rb rc 1989 0
sub_vu8
0 ra rb rc 1797 0
subb_u64
0 ra rb rc 246 0
subb_vu16
0 ra rb rc 1865 0
subb_vu32
0 ra rb rc 1929 0
subb_vu64
0 ra rb rc 1993 0
subb_vu8
0 ra rb rc 1801 0
subo_i64
0 ra rb rc 244 0
subo_vi16
0 ra rb rc 1863 0
subo_vi32
0 ra rb rc 1927 0
subo_vi64
0 ra rb rc 1991 0
subo_vi8
0 ra rb rc 1799 0
subr_imm_i32
26 ra rb simm21
subr_imm_i64
31 ra rb simm21
subr_imm_u32
27 ra rb simm21
syscall
0 0 515 0
sysret
0 0 534 0
tpa
0 ra rb rc 529 0
undef
0 0 0 0
unpack_high_vf16
0 ra rb 0 1344 0
unpack_high_vf32
0 ra rb 0 1504 0
unpack_high_vf64
0 ra rb 0 1664 0
unpack_high_vi16
0 ra rb 0 1889 0
unpack_high_vi32
0 ra rb 0 1953 0
unpack_high_vi8
0 ra rb 0 1825 0
unpack_high_vu16
0 ra rb 0 1891 0
unpack_high_vu32
0 ra rb 0 1955 0
unpack_high_vu8
0 ra rb 0 1827 0
unpack_low_vf16
0 ra rb 0 1345 0
unpack_low_vf32
0 ra rb 0 1505 0
unpack_low_vf64
0 ra rb 0 1665 0
unpack_low_vi16
0 ra rb 0 1888 0
unpack_low_vi32
0 ra rb 0 1952 0
unpack_low_vi8
0 ra rb 0 1824 0
unpack_low_vu16
0 ra rb 0 1890 0
unpack_low_vu32
0 ra rb 0 1954 0
unpack_low_vu8
0 ra rb 0 1826 0
write
2 127 uimm28
xnor
0 ra rb rc 10 0
xor
0 ra rb rc 6 0
xor_dec
0 ra rb rc 240 0
xor_imm
22 ra rb simm21
zext_i16
0 ra rb 0 41 0
zext_i32
0 ra rb 0 42 0
zext_i64
0 ra rb 0 43 0
zext_i8
0 ra rb 0 40 0
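The encodings above all live inside 42-bit slots packed three to a 128-bit bundle with a 2-bit template, per the bundle format described earlier. A minimal sketch of splitting a bundle, assuming the template occupies the low-order 2 bits with slot 1 packed directly above it (the listing only fixes the field widths, 3 × 42 + 2 = 128, not the bit placement):

```python
SLOT_BITS = 42
SLOT_MASK = (1 << SLOT_BITS) - 1

def split_bundle(bundle: int):
    """Return (template, slot1, slot2, slot3) from a 128-bit bundle.

    Bit placement is an assumption: template in bits 1:0,
    slot1 in bits 43:2, slot2 in bits 85:44, slot3 in bits 127:86.
    """
    template = bundle & 0b11
    slots = [(bundle >> (2 + i * SLOT_BITS)) & SLOT_MASK for i in range(3)]
    return (template, *slots)

def make_bundle(template: int, s1: int, s2: int, s3: int) -> int:
    """Inverse of split_bundle, for round-trip checking."""
    return (template & 0b11) | (s1 << 2) | (s2 << 44) | (s3 << 86)
```

The template value then selects how the three slots are interpreted (three short instructions, a long pair, or one very long instruction).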

Opcode groups

Primary opcodes (122 from 128)
misc 0 fused 1 raopx 2 br_misc 3
loop 4 br_32 5 br_64 6 br_128 7
mem_xi64 8 mem_xi32 9 mem_xu32 10 mem_xu64 11
fma_f16 12 fma_f32 13 fma_f64 14 fma_f128 15
ld_imm 16 ld_imm_high 17 andn_imm 18 orn_imm 19
and_imm 20 or_imm 21 xor_imm 22 deposit 23
add_imm_i32 24 add_imm_u32 25 subr_imm_i32 26 subr_imm_u32 27
mul_imm_i32 28 mul_imm_u32 29 add_imm_i64 30 subr_imm_i64 31
mul_imm_i64 32 reserved 33 reserved 34 reserved 35
div_imm_i64 36 div_imm_u64 37 rem_imm_i64 38 rem_imm_u64 39
div_imm_i32 40 div_imm_u32 41 rem_imm_i32 42 rem_imm_u32 43
max_imm_i64 44 max_imm_u64 45 min_imm_i64 46 min_imm_u64 47
max_imm_i32 48 max_imm_u32 49 min_imm_i32 50 min_imm_u32 51
cmp_eq_imm_i64 52 cmp_ne_imm_i64 53 cmp_lt_imm_i64 54 cmp_ge_imm_i64 55
cmp_lt_imm_u64 56 cmp_ge_imm_u64 57 cmp_eq_imm_i32 58 cmp_ne_imm_i32 59
cmp_lt_imm_i32 60 cmp_ge_imm_i32 61 cmp_lt_imm_u32 62 cmp_ge_imm_u32 63
cmp_eq_imm_i128 64 cmp_ne_imm_i128 65 cmp_lt_imm_i128 66 cmp_ge_imm_i128 67
cmp_lt_imm_u128 68 cmp_ge_imm_u128 69 add_imm_i128 70 reserved 71
ld_u8 72 ld_i8 73 st_i8 74 ld_u16 75
ld_i16 76 st_i16 77 ld_u32 78 ld_i32 79
st_i32 80 ld_u64 81 ld_i64 82 st_i64 83
ld_i128 84 st_i128 85 reserved 86 reserved 87
ld_iprel_u8 88 ld_iprel_i8 89 st_iprel_i8 90 ld_iprel_u16 91
ld_iprel_i16 92 st_iprel_i16 93 ld_iprel_u32 94 ld_iprel_i32 95
st_iprel_i32 96 ld_iprel_u64 97 ld_iprel_i64 98 st_iprel_i64 99
ld_iprel_i128 100 st_iprel_i128 101 lda_iprel 102 call_plt 103
call 104 lda_r 105 br_eq_imm_i32 106 br_ne_imm_i32 107
br_lt_imm_i32 108 br_ge_imm_i32 109 br_lt_imm_u32 110 br_ge_imm_u32 111
br_eq_imm_i64 112 br_ne_imm_i64 113 br_lt_imm_i64 114 br_ge_imm_i64 115
br_lt_imm_u64 116 br_ge_imm_u64 117 br_eq_imm_i128 118 br_ne_imm_i128 119
br_lt_imm_i128 120 br_ge_imm_i128 121 br_lt_imm_u128 122 br_ge_imm_u128 123
br_mask_all 124 br_mask_notall 125 br_mask_none 126 br_mask_any 127
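The table above, together with the extended-opcode tables that follow, implies a two-level lookup: a primary opcode either names an instruction directly or selects a group («misc», «fused», «raopx», «br_32», …) whose opx field picks the final instruction. A sketch of that dispatch, using small excerpts of the tables (the full «misc» group holds up to 2048 extended opcodes, branch groups hold 16):

```python
# Two-level opcode lookup implied by the opcode-group tables.
# Both dictionaries are excerpts of the documented tables, not the
# complete maps; the opx field position within the slot is not modeled.

PRIMARY = {0: "misc", 1: "fused", 2: "raopx", 3: "br_misc",
           5: "br_32", 16: "ld_imm", 22: "xor_imm", 104: "call"}

EXTENDED = {
    # "misc" uses a wide opx field (2048 slots); excerpt only.
    "misc":  {6: "xor", 10: "xnor", 41: "zext_i16", 240: "xor_dec"},
    # branch groups use a 4-bit opx field (16 slots); excerpt only.
    "br_32": {0: "br_eq_i32", 15: "br_u_f32"},
}

def lookup(primary: int, opx: int) -> str:
    name = PRIMARY.get(primary, "reserved")
    if name in EXTENDED:            # group name -> second-level table
        return EXTENDED[name].get(opx, "reserved")
    return name                     # direct instruction, opx unused

assert lookup(0, 240) == "xor_dec"
assert lookup(5, 15) == "br_u_f32"
```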

The «br_misc» extended opcodes (4 from 16)
br_bc 0 br_bc_imm 1 br_bs 2 br_bs_imm 3
reserved 4 reserved 5 reserved 6 reserved 7
reserved 8 reserved 9 reserved 10 reserved 11
reserved 12 reserved 13 reserved 14 reserved 15

The «br_32» extended opcodes (16 from 16)
br_eq_i32 0 br_ne_i32 1 br_lt_i32 2 br_ge_i32 3
br_lt_u32 4 br_ge_u32 5 br_oeq_f32 6 br_one_f32 7
br_olt_f32 8 br_oge_f32 9 br_o_f32 10 br_ueq_f32 11
br_une_f32 12 br_ult_f32 13 br_uge_f32 14 br_u_f32 15

The «br_64» extended opcodes (16 from 16)
br_eq_i64 0 br_ne_i64 1 br_lt_i64 2 br_ge_i64 3
br_lt_u64 4 br_ge_u64 5 br_oeq_f64 6 br_one_f64 7
br_olt_f64 8 br_oge_f64 9 br_o_f64 10 br_ueq_f64 11
br_une_f64 12 br_ult_f64 13 br_uge_f64 14 br_u_f64 15

The «br_128» extended opcodes (16 from 16)
br_eq_i128 0 br_ne_i128 1 br_lt_i128 2 br_ge_i128 3
br_lt_u128 4 br_ge_u128 5 br_oeq_f128 6 br_one_f128 7
br_olt_f128 8 br_oge_f128 9 br_o_f128 10 br_ueq_f128 11
br_une_f128 12 br_ult_f128 13 br_uge_f128 14 br_u_f128 15

The «fused» extended opcodes (87 from 128)
nul_misc 0 nul_32 1 nul_64 2 nul_128 3
reserved 4 reserved 5 reserved 6 reserved 7
add_add_i64 8 add_sub_i64 9 sub_sub_i64 10 add_addc_u64 11
sub_subb_u64 12 slp_i32 13 srp_i32 14 srp_imm_i32 15
slp_i64 16 srp_i64 17 srp_imm_i64 18 slsrl_u64 19
slsra_i64 20 slsrl_imm_u64 21 slsra_imm_i64 22 sl_add_i64 23
sl_sub_i64 24 sl_subr_i64 25 slsrl_u32 26 slsra_i32 27
mbsel 28 perm 29 mov2 30 alignup_u64 31
crc32c 32 reserved 33 reserved 34 reserved 35
reserved 36 sl_add_i32 37 sl_add_u32 38 sl_subr_i32 39
sl_subr_u32 40 sl_sub_i32 41 sl_sub_u32 42 deposit_r 43
slp_i128 44 srp_i128 45 srp_imm_i128 46 reserved 47
reserved 48 reserved 49 sl_or 50 sl_xor 51
reserved 52 reserved 53 reserved 54 reserved 55
reserved 56 reserved 57 reserved 58 reserved 59
cmov_lsb 60 cmov_eq_i32 61 cmov_lt_i32 62 cmov_le_i32 63
reserved 64 cmov_eq_i64 65 cmov_lt_i64 66 cmov_le_i64 67
reserved 68 cmov_eq_i128 69 cmov_lt_i128 70 cmov_le_i128 71
mul_add 72 mul_sub 73 mul_subr 74 merge_f16 75
merge_f32 76 merge_f64 77 merge_f128 78 merge_vf16 79
merge_vf32 80 merge_vf64 81 reserved 82 reserved 83
reserved 84 reserved 85 reserved 86 reserved 87
reserved 88 reserved 89 reserved 90 reserved 91
reserved 92 reserved 93 reserved 94 reserved 95
reserved 96 reserved 97 reserved 98 reserved 99
call_mi 100 call_rvt 101 lda_n 102 lda_nrc 103
reserved 104 reserved 105 nul_eq_imm_i32 106 nul_ne_imm_i32 107
nul_lt_imm_i32 108 nul_ge_imm_i32 109 nul_lt_imm_u32 110 nul_ge_imm_u32 111
nul_eq_imm_i64 112 nul_ne_imm_i64 113 nul_lt_imm_i64 114 nul_ge_imm_i64 115
nul_lt_imm_u64 116 nul_ge_imm_u64 117 nul_eq_imm_i128 118 nul_ne_imm_i128 119
nul_lt_imm_i128 120 nul_ge_imm_i128 121 nul_lt_imm_u128 122 nul_ge_imm_u128 123
nul_mask_all 124 nul_mask_notall 125 nul_mask_none 126 nul_mask_any 127

The «nul_misc» extended opcodes (4 from 16)
nul_bc 0 nul_bc_imm 1 nul_bs 2 nul_bs_imm 3
reserved 4 reserved 5 reserved 6 reserved 7
reserved 8 reserved 9 reserved 10 reserved 11
reserved 12 reserved 13 reserved 14 reserved 15

The «nul_32» extended opcodes (16 from 16)
nul_eq_i32 0 nul_ne_i32 1 nul_lt_i32 2 nul_ge_i32 3
nul_lt_u32 4 nul_ge_u32 5 nul_oeq_f32 6 nul_one_f32 7
nul_olt_f32 8 nul_oge_f32 9 nul_o_f32 10 nul_ueq_f32 11
nul_une_f32 12 nul_ult_f32 13 nul_uge_f32 14 nul_u_f32 15

The «nul_64» extended opcodes (16 from 16)
nul_eq_i64 0 nul_ne_i64 1 nul_lt_i64 2 nul_ge_i64 3
nul_lt_u64 4 nul_ge_u64 5 nul_oeq_f64 6 nul_one_f64 7
nul_olt_f64 8 nul_oge_f64 9 nul_o_f64 10 nul_ueq_f64 11
nul_une_f64 12 nul_ult_f64 13 nul_uge_f64 14 nul_u_f64 15

The «nul_128» extended opcodes (16 from 16)
nul_eq_i128 0 nul_ne_i128 1 nul_lt_i128 2 nul_ge_i128 3
nul_lt_u128 4 nul_ge_u128 5 nul_oeq_f128 6 nul_one_f128 7
nul_olt_f128 8 nul_oge_f128 9 nul_o_f128 10 nul_ueq_f128 11
nul_une_f128 12 nul_ult_f128 13 nul_uge_f128 14 nul_u_f128 15

The «mem_xi64» extended opcodes (15 from 16)
ld_xi64_u8 0 ld_xi64_i8 1 st_xi64_i8 2 ld_xi64_u16 3
ld_xi64_i16 4 st_xi64_i16 5 ld_xi64_u32 6 ld_xi64_i32 7
st_xi64_i32 8 ld_xi64_u64 9 ld_xi64_i64 10 st_xi64_i64 11
ld_xi64_i128 12 st_xi64_i128 13 lda_xi64 14 reserved 15

The «mem_xu64» extended opcodes (15 from 16)
ld_xu64_u8 0 ld_xu64_i8 1 st_xu64_i8 2 ld_xu64_u16 3
ld_xu64_i16 4 st_xu64_i16 5 ld_xu64_u32 6 ld_xu64_i32 7
st_xu64_i32 8 ld_xu64_u64 9 ld_xu64_i64 10 st_xu64_i64 11
ld_xu64_i128 12 st_xu64_i128 13 lda_xu64 14 reserved 15

The «mem_xi32» extended opcodes (15 from 16)
ld_xi32_u8 0 ld_xi32_i8 1 st_xi32_i8 2 ld_xi32_u16 3
ld_xi32_i16 4 st_xi32_i16 5 ld_xi32_u32 6 ld_xi32_i32 7
st_xi32_i32 8 ld_xi32_u64 9 ld_xi32_i64 10 st_xi32_i64 11
ld_xi32_i128 12 st_xi32_i128 13 lda_xi32 14 reserved 15

The «mem_xu32» extended opcodes (15 from 16)
ld_xu32_u8 0 ld_xu32_i8 1 st_xu32_i8 2 ld_xu32_u16 3
ld_xu32_i16 4 st_xu32_i16 5 ld_xu32_u32 6 ld_xu32_i32 7
st_xu32_i32 8 ld_xu32_u64 9 ld_xu32_i64 10 st_xu32_i64 11
ld_xu32_i128 12 st_xu32_i128 13 lda_xu32 14 reserved 15

The «fma_f16» extended opcodes (10 from 16)
madd_f16 0 msub_f16 1 nmadd_f16 2 nmsub_f16 3
madd_vf16 4 msub_vf16 5 nmadd_vf16 6 nmsub_vf16 7
madd_alt_vf16 8 msub_alt_vf16 9 reserved 10 reserved 11
reserved 12 reserved 13 reserved 14 reserved 15

The «fma_f32» extended opcodes (10 from 16)
madd_f32 0 msub_f32 1 nmadd_f32 2 nmsub_f32 3
madd_vf32 4 msub_vf32 5 nmadd_vf32 6 nmsub_vf32 7
madd_alt_vf32 8 msub_alt_vf32 9 reserved 10 reserved 11
reserved 12 reserved 13 reserved 14 reserved 15

The «fma_f64» extended opcodes (10 from 16)
madd_f64 0 msub_f64 1 nmadd_f64 2 nmsub_f64 3
madd_vf64 4 msub_vf64 5 nmadd_vf64 6 nmsub_vf64 7
madd_alt_vf64 8 msub_alt_vf64 9 reserved 10 reserved 11
reserved 12 reserved 13 reserved 14 reserved 15

The «fma_f128» extended opcodes (9 from 16)
madd_f128 0 msub_f128 1 nmadd_f128 2 nmsub_f128 3
reserved 4 reserved 5 reserved 6 reserved 7
reserved 8 reserved 9 reserved 10 amo_cas_u8 11
amo_cas_u16 12 amo_cas_u32 13 amo_cas_u64 14 amo_cas_u128 15

The «loop» extended opcodes (16 from 16)
rep_lt_i64 0 rep_gt_i64 1 rep_le_i64 2 rep_ge_i64 3
rep_lt_u64 4 rep_gt_u64 5 rep_le_u64 6 rep_ge_u64 7
rep_lt_i32 8 rep_gt_i32 9 rep_le_i32 10 rep_ge_i32 11
rep_lt_u32 12 rep_gt_u32 13 rep_le_u32 14 rep_ge_u32 15

The «raopx» extended opcodes (14 from 128)
nop 0 jmp 1 retf 2 alloc 3
alloc_sp 4 reserved 5 reserved 6 reserved 7
eh_adj 8 eh_throw 9 eh_catch 10 eh_next 11
reserved 12 reserved 13 reserved 14 reserved 15
dcbt 16 dcbf 17 dcbi 18 icbi 19
reserved 20 reserved 21 reserved 22 reserved 23
reserved 24 reserved 25 reserved 26 reserved 27
reserved 28 reserved 29 reserved 30 reserved 31
reserved 32 reserved 33 reserved 34 reserved 35
reserved 36 reserved 37 reserved 38 reserved 39
reserved 40 reserved 41 reserved 42 reserved 43
reserved 44 reserved 45 reserved 46 reserved 47
reserved 48 reserved 49 reserved 50 reserved 51
reserved 52 reserved 53 reserved 54 reserved 55
reserved 56 reserved 57 reserved 58 reserved 59
reserved 60 reserved 61 reserved 62 reserved 63
reserved 64 reserved 65 reserved 66 reserved 67
reserved 68 reserved 69 reserved 70 reserved 71
reserved 72 reserved 73 reserved 74 reserved 75
reserved 76 reserved 77 reserved 78 reserved 79
reserved 80 reserved 81 reserved 82 reserved 83
reserved 84 reserved 85 reserved 86 reserved 87
reserved 88 reserved 89 reserved 90 reserved 91
reserved 92 reserved 93 reserved 94 reserved 95
reserved 96 reserved 97 reserved 98 reserved 99
reserved 100 reserved 101 reserved 102 reserved 103
reserved 104 reserved 105 reserved 106 reserved 107
reserved 108 reserved 109 reserved 110 reserved 111
reserved 112 reserved 113 reserved 114 reserved 115
reserved 116 reserved 117 reserved 118 reserved 119
reserved 120 reserved 121 reserved 122 reserved 123
reserved 124 reserved 125 reserved 126 write 127
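The «write» entry above (opx 127 under primary opcode 2, «raopx») matches the write slot format «opcode opx uimm28» and the encoding row «2 127 uimm28» in the instruction list. A sketch of assembling that slot, assuming the same 7/7/28 field split used by the other 42-bit formats:

```python
# Assemble a "write" slot: primary opcode 2 (raopx), extended opcode 127,
# 28-bit unsigned immediate. Bit positions (opcode 41..35, opx 34..28,
# uimm28 27..0) are inferred from the slot-format table.

def encode_write(uimm28: int) -> int:
    assert 0 <= uimm28 < (1 << 28), "uimm28 must fit in 28 bits"
    return (2 << 35) | (127 << 28) | uimm28

slot = encode_write(0x123)
assert (slot >> 35) & 0x7F == 2      # primary opcode: raopx group
assert (slot >> 28) & 0x7F == 127    # extended opcode: write
assert slot & 0x0FFFFFFF == 0x123    # immediate payload
```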

The «misc» extended opcodes (720 from 2048)
undef 0 mov 1 ret 2 not 3
and 4 or 5 xor 6 andn 7
nand 8 nor 9 xnor 10 orn 11
jmp_r 12 jmp_t 13 jmp_t_i32 14 jmp_t_u32 15
call_ri 16 reserved 17 reserved 18 reserved 19
mbgath 20 mbscat 21 gtb 22 reserved 23
cnt_pop 24 cnt_lz 25 cnt_tz 26 permb 27
bit_clear 28 bit_clear_imm 29 bit_set 30 bit_set_imm 31
bit_flip 32 bit_flip_imm 33 reserved 34 reserved 35
sext_i8 36 sext_i16 37 sext_i32 38 sext_i64 39
zext_i8 40 zext_i16 41 zext_i32 42 zext_i64 43
reserved 44 reserved 45 reserved 46 reserved 47
reserved 48 reserved 49 reserved 50 reserved 51
reserved 52 reserved 53 reserved 54 reserved 55
reserved 56 reserved 57 reserved 58 reserved 59
reserved 60 reserved 61 reserved 62 reserved 63
reserved 64 reserved 65 reserved 66 reserved 67
reserved 68 reserved 69 reserved 70 reserved 71
reserved 72 reserved 73 reserved 74 reserved 75
reserved 76 reserved 77 reserved 78 reserved 79
reserved 80 reserved 81 reserved 82 reserved 83
reserved 84 reserved 85 reserved 86 reserved 87
reserved 88 reserved 89 reserved 90 reserved 91
reserved 92 reserved 93 reserved 94 reserved 95
ld_mia_u8 96 ld_mia_i8 97 st_mia_i8 98 ld_mia_u16 99
ld_mia_i16 100 st_mia_i16 101 ld_mia_u32 102 ld_mia_i32 103
st_mia_i32 104 ld_mia_u64 105 ld_mia_i64 106 st_mia_i64 107
ld_mia_i128 108 st_mia_i128 109 reserved 110 reserved 111
ld_mib_u8 112 ld_mib_i8 113 st_mib_i8 114 ld_mib_u16 115
ld_mib_i16 116 st_mib_i16 117 ld_mib_u32 118 ld_mib_i32 119
st_mib_i32 120 ld_mib_u64 121 ld_mib_i64 122 st_mib_i64 123
ld_mib_i128 124 st_mib_i128 125 reserved 126 reserved 127
add_i32 128 sub_i32 129 mul_i32 130 neg_i32 131
sll_imm_u32 132 srl_imm_u32 133 sra_imm_i32 134 divp2_imm_i32 135
sll_u32 136 srl_u32 137 sra_i32 138 divp2_i32 139
cmp_eq_i32 140 cmp_ne_i32 141 cmp_lt_i32 142 cmp_ge_i32 143
cmp_lt_u32 144 cmp_ge_u32 145 abs_diff_i32 146 abs_i32 147
div_i32 148 div_u32 149 rem_i32 150 rem_u32 151
max_i32 152 max_u32 153 min_i32 154 min_u32 155
reserved 156 reserved 157 reserved 158 reserved 159
mul_u32 160 add_u32 161 sub_u32 162 reserved 163
reserved 164 reserved 165 reserved 166 reserved 167
reserved 168 reserved 169 reserved 170 reserved 171
reserved 172 reserved 173 reserved 174 reserved 175
reserved 176 reserved 177 reserved 178 reserved 179
reserved 180 reserved 181 reserved 182 reserved 183
reserved 184 reserved 185 reserved 186 reserved 187
reserved 188 reserved 189 reserved 190 reserved 191
reserved 192 reserved 193 reserved 194 reserved 195
reserved 196 reserved 197 reserved 198 reserved 199
reserved 200 reserved 201 reserved 202 reserved 203
reserved 204 reserved 205 reserved 206 reserved 207
add_i64 208 sub_i64 209 mul_i64 210 neg_i64 211
sll_imm_u64 212 srl_imm_u64 213 sra_imm_i64 214 divp2_imm_i64 215
sll_u64 216 srl_u64 217 sra_i64 218 divp2_i64 219
cmp_eq_i64 220 cmp_ne_i64 221 cmp_lt_i64 222 cmp_ge_i64 223
cmp_lt_u64 224 cmp_ge_u64 225 abs_diff_i64 226 abs_i64 227
div_i64 228 div_u64 229 rem_i64 230 rem_u64 231
max_i64 232 max_u64 233 min_i64 234 min_u64 235
reserved 236 reserved 237 reserved 238 reserved 239
xor_dec 240 and_dec 241 and_neg 242 addo_i64 243
subo_i64 244 addc_u64 245 subb_u64 246 mul_h 247
reserved 248 reserved 249 reserved 250 reserved 251
reserved 252 reserved 253 reserved 254 reserved 255
reserved 256 reserved 257 reserved 258 reserved 259
reserved 260 reserved 261 reserved 262 reserved 263
reserved 264 reserved 265 reserved 266 reserved 267
reserved 268 reserved 269 reserved 270 reserved 271
reserved 272 reserved 273 reserved 274 reserved 275
reserved 276 reserved 277 reserved 278 reserved 279
reserved 280 reserved 281 reserved 282 reserved 283
reserved 284 reserved 285 reserved 286 reserved 287
add_i128 288 sub_i128 289 mul_i128 290 neg_i128 291
sll_imm_u128 292 srl_imm_u128 293 sra_imm_i128 294 divp2_imm_i128 295
sll_u128 296 srl_u128 297 sra_i128 298 divp2_i128 299
cmp_eq_i128 300 cmp_ne_i128 301 cmp_lt_i128 302 cmp_ge_i128 303
cmp_lt_u128 304 cmp_ge_u128 305 abs_diff_i128 306 abs_i128 307
div_i128 308 div_u128 309 rem_i128 310 rem_u128 311
max_i128 312 max_u128 313 min_i128 314 min_u128 315
reserved 316 reserved 317 reserved 318 reserved 319
reserved 320 reserved 321 reserved 322 reserved 323
reserved 324 reserved 325 reserved 326 reserved 327
reserved 328 reserved 329 reserved 330 reserved 331
reserved 332 reserved 333 reserved 334 reserved 335
reserved 336 reserved 337 reserved 338 reserved 339
reserved 340 reserved 341 reserved 342 reserved 343
reserved 344 reserved 345 reserved 346 reserved 347
reserved 348 reserved 349 reserved 350 reserved 351
reserved 352 reserved 353 reserved 354 reserved 355
reserved 356 reserved 357 reserved 358 reserved 359
reserved 360 reserved 361 reserved 362 reserved 363
reserved 364 reserved 365 reserved 366 reserved 367
reserved 368 reserved 369 reserved 370 reserved 371
reserved 372 reserved 373 reserved 374 reserved 375
reserved 376 reserved 377 reserved 378 reserved 379
reserved 380 reserved 381 reserved 382 reserved 383
reserved 384 reserved 385 reserved 386 reserved 387
reserved 388 reserved 389 reserved 390 reserved 391
reserved 392 reserved 393 reserved 394 reserved 395
reserved 396 reserved 397 reserved 398 reserved 399
reserved 400 reserved 401 reserved 402 reserved 403
reserved 404 reserved 405 reserved 406 reserved 407
reserved 408 reserved 409 reserved 410 reserved 411
reserved 412 reserved 413 reserved 414 reserved 415
reserved 416 reserved 417 reserved 418 reserved 419
reserved 420 reserved 421 reserved 422 reserved 423
reserved 424 reserved 425 reserved 426 reserved 427
reserved 428 reserved 429 reserved 430 reserved 431
reserved 432 reserved 433 reserved 434 reserved 435
reserved 436 reserved 437 reserved 438 reserved 439
reserved 440 reserved 441 reserved 442 reserved 443
reserved 444 reserved 445 reserved 446 reserved 447
reserved 448 reserved 449 reserved 450 reserved 451
reserved 452 reserved 453 reserved 454 reserved 455
reserved 456 reserved 457 reserved 458 reserved 459
reserved 460 reserved 461 reserved 462 reserved 463
reserved 464 reserved 465 reserved 466 reserved 467
reserved 468 reserved 469 reserved 470 reserved 471
reserved 472 reserved 473 reserved 474 reserved 475
reserved 476 reserved 477 reserved 478 reserved 479
reserved 480 reserved 481 reserved 482 reserved 483
reserved 484 reserved 485 reserved 486 reserved 487
reserved 488 reserved 489 reserved 490 reserved 491
reserved 492 reserved 493 reserved 494 reserved 495
reserved 496 reserved 497 reserved 498 reserved 499
reserved 500 reserved 501 reserved 502 reserved 503
reserved 504 reserved 505 reserved 506 reserved 507
reserved 508 reserved 509 reserved 510 reserved 511
mprobe 512 cpuid 513 int 514 syscall 515
random 516 get_spr 517 set_spr 518 set_dbr 519
get_dbr 520 set_ibr 521 get_ibr 522 set_mr 523
get_mr 524 set_itr 525 set_dtr 526 rfi 527
halt 528 tpa 529 ptc 530 reserved 531
reserved 532 reserved 533 sysret 534 reserved 535
rscover 536 rsflush 537 rsload 538 reserved 539
reserved 540 reserved 541 reserved 542 reserved 543
clmul 544 reserved 545 reserved 546 reserved 547
reserved 548 reserved 549 reserved 550 reserved 551
aes_enc 552 aes_enc_last 553 aes_dec 554 aes_dec_last 555
aes_imc 556 aes_keygen_assist 557 reserved 558 reserved 559
reserved 560 reserved 561 reserved 562 jmp_mi 563
fence 564 reserved 565 reserved 566 reserved 567
reserved 568 reserved 569 reserved 570 reserved 571
reserved 572 reserved 573 reserved 574 reserved 575
reserved 576 reserved 577 reserved 578 reserved 579
reserved 580 reserved 581 reserved 582 reserved 583
reserved 584 reserved 585 reserved 586 reserved 587
reserved 588 reserved 589 reserved 590 reserved 591
reserved 592 reserved 593 reserved 594 reserved 595
reserved 596 reserved 597 reserved 598 reserved 599
reserved 600 reserved 601 reserved 602 reserved 603
reserved 604 reserved 605 reserved 606 reserved 607
reserved 608 reserved 609 reserved 610 reserved 611
reserved 612 reserved 613 reserved 614 reserved 615
reserved 616 reserved 617 reserved 618 reserved 619
reserved 620 reserved 621 reserved 622 reserved 623
reserved 624 reserved 625 reserved 626 reserved 627
reserved 628 reserved 629 reserved 630 reserved 631
reserved 632 reserved 633 reserved 634 reserved 635
reserved 636 reserved 637 reserved 638 reserved 639
reserved 640 reserved 641 reserved 642 reserved 643
reserved 644 reserved 645 reserved 646 reserved 647
reserved 648 reserved 649 reserved 650 reserved 651
reserved 652 reserved 653 reserved 654 reserved 655
reserved 656 reserved 657 reserved 658 reserved 659
reserved 660 reserved 661 reserved 662 reserved 663
reserved 664 reserved 665 reserved 666 reserved 667
reserved 668 reserved 669 reserved 670 reserved 671
reserved 672 reserved 673 reserved 674 reserved 675
reserved 676 reserved 677 reserved 678 reserved 679
reserved 680 reserved 681 reserved 682 reserved 683
reserved 684 reserved 685 reserved 686 reserved 687
reserved 688 reserved 689 reserved 690 reserved 691
reserved 692 reserved 693 reserved 694 reserved 695
reserved 696 reserved 697 reserved 698 reserved 699
reserved 700 reserved 701 reserved 702 reserved 703
reserved 704 reserved 705 reserved 706 reserved 707
reserved 708 reserved 709 reserved 710 reserved 711
reserved 712 reserved 713 reserved 714 reserved 715
reserved 716 reserved 717 reserved 718 reserved 719
reserved 720 reserved 721 reserved 722 reserved 723
reserved 724 reserved 725 reserved 726 reserved 727
reserved 728 reserved 729 reserved 730 reserved 731
reserved 732 reserved 733 reserved 734 reserved 735
reserved 736 reserved 737 reserved 738 reserved 739
reserved 740 reserved 741 reserved 742 reserved 743
reserved 744 reserved 745 reserved 746 reserved 747
reserved 748 reserved 749 reserved 750 reserved 751
reserved 752 reserved 753 reserved 754 reserved 755
reserved 756 reserved 757 reserved 758 reserved 759
reserved 760 reserved 761 reserved 762 reserved 763
reserved 764 reserved 765 reserved 766 reserved 767
amo_ld_u8 768 amo_st_u8 769 amo_swap_u8 770 amo_add_u8 771
amo_and_u8 772 amo_or_u8 773 amo_xor_u8 774 amo_min_i8 775
amo_max_i8 776 amo_min_u8 777 amo_max_u8 778 amo_sub_u8 779
reserved 780 reserved 781 reserved 782 reserved 783
reserved 784 reserved 785 reserved 786 reserved 787
reserved 788 reserved 789 reserved 790 reserved 791
reserved 792 reserved 793 reserved 794 reserved 795
reserved 796 reserved 797 reserved 798 reserved 799
reserved 800 reserved 801 reserved 802 reserved 803
reserved 804 reserved 805 reserved 806 reserved 807
amo_ld_u16 808 amo_st_u16 809 amo_swap_u16 810 amo_add_u16 811
amo_and_u16 812 amo_or_u16 813 amo_xor_u16 814 amo_min_i16 815
amo_max_i16 816 amo_min_u16 817 amo_max_u16 818 amo_sub_u16 819
reserved 820 reserved 821 reserved 822 reserved 823
reserved 824 reserved 825 reserved 826 reserved 827
reserved 828 reserved 829 reserved 830 reserved 831
reserved 832 reserved 833 reserved 834 reserved 835
reserved 836 reserved 837 reserved 838 reserved 839
reserved 840 reserved 841 reserved 842 reserved 843
reserved 844 reserved 845 reserved 846 reserved 847
amo_ld_u32 848 amo_st_u32 849 amo_swap_u32 850 amo_add_u32 851
amo_and_u32 852 amo_or_u32 853 amo_xor_u32 854 amo_min_i32 855
amo_max_i32 856 amo_min_u32 857 amo_max_u32 858 amo_sub_u32 859
reserved 860 reserved 861 reserved 862 reserved 863
reserved 864 reserved 865 reserved 866 reserved 867
reserved 868 reserved 869 reserved 870 reserved 871
reserved 872 reserved 873 reserved 874 reserved 875
reserved 876 reserved 877 reserved 878 reserved 879
reserved 880 reserved 881 reserved 882 reserved 883
reserved 884 reserved 885 reserved 886 reserved 887
amo_ld_u64 888 amo_st_u64 889 amo_swap_u64 890 amo_add_u64 891
amo_and_u64 892 amo_or_u64 893 amo_xor_u64 894 amo_min_i64 895
amo_max_i64 896 amo_min_u64 897 amo_max_u64 898 amo_sub_u64 899
reserved 900 reserved 901 reserved 902 reserved 903
reserved 904 reserved 905 reserved 906 reserved 907
reserved 908 reserved 909 reserved 910 reserved 911
reserved 912 reserved 913 reserved 914 reserved 915
reserved 916 reserved 917 reserved 918 reserved 919
reserved 920 reserved 921 reserved 922 reserved 923
reserved 924 reserved 925 reserved 926 reserved 927
amo_ld_u128 928 amo_st_u128 929 amo_swap_u128 930 amo_add_u128 931
amo_and_u128 932 amo_or_u128 933 amo_xor_u128 934 amo_min_i128 935
amo_max_i128 936 amo_min_u128 937 amo_max_u128 938 amo_sub_u128 939
reserved 940 reserved 941 reserved 942 reserved 943
reserved 944 reserved 945 reserved 946 reserved 947
reserved 948 reserved 949 reserved 950 reserved 951
reserved 952 reserved 953 reserved 954 reserved 955
reserved 956 reserved 957 reserved 958 reserved 959
reserved 960 reserved 961 reserved 962 reserved 963
reserved 964 reserved 965 reserved 966 reserved 967
reserved 968 reserved 969 reserved 970 reserved 971
reserved 972 reserved 973 reserved 974 reserved 975
reserved 976 reserved 977 reserved 978 reserved 979
reserved 980 reserved 981 reserved 982 reserved 983
reserved 984 reserved 985 reserved 986 reserved 987
reserved 988 reserved 989 reserved 990 reserved 991
reserved 992 reserved 993 reserved 994 reserved 995
reserved 996 reserved 997 reserved 998 reserved 999
reserved 1000 reserved 1001 reserved 1002 reserved 1003
reserved 1004 reserved 1005 reserved 1006 reserved 1007
reserved 1008 reserved 1009 reserved 1010 reserved 1011
reserved 1012 reserved 1013 reserved 1014 reserved 1015
reserved 1016 reserved 1017 reserved 1018 reserved 1019
reserved 1020 reserved 1021 reserved 1022 reserved 1023
reserved 1024 reserved 1025 reserved 1026 reserved 1027
reserved 1028 reserved 1029 reserved 1030 reserved 1031
reserved 1032 reserved 1033 reserved 1034 reserved 1035
reserved 1036 reserved 1037 reserved 1038 reserved 1039
reserved 1040 reserved 1041 reserved 1042 reserved 1043
reserved 1044 reserved 1045 reserved 1046 reserved 1047
reserved 1048 reserved 1049 reserved 1050 reserved 1051
reserved 1052 reserved 1053 reserved 1054 reserved 1055
reserved 1056 reserved 1057 reserved 1058 reserved 1059
reserved 1060 reserved 1061 reserved 1062 reserved 1063
reserved 1064 reserved 1065 reserved 1066 reserved 1067
reserved 1068 reserved 1069 reserved 1070 reserved 1071
reserved 1072 reserved 1073 reserved 1074 reserved 1075
reserved 1076 reserved 1077 reserved 1078 reserved 1079
reserved 1080 reserved 1081 reserved 1082 reserved 1083
reserved 1084 reserved 1085 reserved 1086 reserved 1087
reserved 1088 reserved 1089 reserved 1090 reserved 1091
reserved 1092 reserved 1093 reserved 1094 reserved 1095
reserved 1096 reserved 1097 reserved 1098 reserved 1099
reserved 1100 reserved 1101 reserved 1102 reserved 1103
reserved 1104 reserved 1105 reserved 1106 reserved 1107
reserved 1108 reserved 1109 reserved 1110 reserved 1111
reserved 1112 reserved 1113 reserved 1114 reserved 1115
reserved 1116 reserved 1117 reserved 1118 reserved 1119
reserved 1120 reserved 1121 reserved 1122 reserved 1123
reserved 1124 reserved 1125 reserved 1126 reserved 1127
reserved 1128 reserved 1129 reserved 1130 reserved 1131
reserved 1132 reserved 1133 reserved 1134 reserved 1135
reserved 1136 reserved 1137 reserved 1138 reserved 1139
reserved 1140 reserved 1141 reserved 1142 reserved 1143
reserved 1144 reserved 1145 reserved 1146 reserved 1147
reserved 1148 reserved 1149 reserved 1150 reserved 1151
reserved 1152 reserved 1153 reserved 1154 reserved 1155
reserved 1156 reserved 1157 reserved 1158 reserved 1159
reserved 1160 reserved 1161 reserved 1162 reserved 1163
reserved 1164 reserved 1165 reserved 1166 reserved 1167
reserved 1168 reserved 1169 reserved 1170 reserved 1171
reserved 1172 reserved 1173 reserved 1174 reserved 1175
reserved 1176 reserved 1177 reserved 1178 reserved 1179
reserved 1180 reserved 1181 reserved 1182 reserved 1183
reserved 1184 reserved 1185 reserved 1186 reserved 1187
reserved 1188 reserved 1189 reserved 1190 reserved 1191
reserved 1192 reserved 1193 reserved 1194 reserved 1195
reserved 1196 reserved 1197 reserved 1198 reserved 1199
reserved 1200 reserved 1201 reserved 1202 reserved 1203
reserved 1204 reserved 1205 reserved 1206 reserved 1207
reserved 1208 reserved 1209 reserved 1210 reserved 1211
reserved 1212 reserved 1213 reserved 1214 reserved 1215
class_f16 1216 sqrt_f16 1217 rsqrt_f16 1218 add_f16 1219
sub_f16 1220 nadd_f16 1221 mul_f16 1222 nmul_f16 1223
div_f16 1224 neg_f16 1225 abs_f16 1226 nabs_f16 1227
abs_diff_f16 1228 nabs_diff_f16 1229 min_f16 1230 max_f16 1231
minnum_f16 1232 maxnum_f16 1233 abs_min_f16 1234 abs_max_f16 1235
round_f16 1236 roundnx_f16 1237 reserved 1238 reserved 1239
cmp_oeq_f16 1240 cmp_one_f16 1241 cmp_olt_f16 1242 cmp_oge_f16 1243
cmp_o_f16 1244 cmp_ueq_f16 1245 cmp_une_f16 1246 cmp_ult_f16 1247
cmp_uge_f16 1248 cmp_u_f16 1249 reserved 1250 reserved 1251
reserved 1252 reserved 1253 reserved 1254 reserved 1255
cvt_f16_i32 1256 cvt_f16_u32 1257 cvt_i32_f16 1258 cvt_u32_f16 1259
cvt_f16_i64 1260 cvt_f16_u64 1261 cvt_i64_f16 1262 cvt_u64_f16 1263
cvt_f16_i128 1264 cvt_f16_u128 1265 cvt_i128_f16 1266 cvt_u128_f16 1267
reserved 1268 reserved 1269 reserved 1270 reserved 1271
extend_f16_f32 1272 extend_f16_f64 1273 cvt_f32_f16 1274 cvt_f64_f16 1275
reserved 1276 reserved 1277 reserved 1278 reserved 1279
reserved 1280 reserved 1281 reserved 1282 reserved 1283
reserved 1284 reserved 1285 reserved 1286 reserved 1287
reserved 1288 reserved 1289 reserved 1290 reserved 1291
reserved 1292 reserved 1293 reserved 1294 reserved 1295
neg_vf16 1296 abs_vf16 1297 nabs_vf16 1298 abs_diff_vf16 1299
nabs_diff_vf16 1300 rsqrt_vf16 1301 sqrt_vf16 1302 add_vf16 1303
sub_vf16 1304 nadd_vf16 1305 mul_vf16 1306 nmul_vf16 1307
div_vf16 1308 min_vf16 1309 max_vf16 1310 minnum_vf16 1311
maxnum_vf16 1312 abs_min_vf16 1313 abs_max_vf16 1314 round_vf16 1315
roundnx_vf16 1316 reserved 1317 reserved 1318 reserved 1319
reserved 1320 reserved 1321 reserved 1322 reserved 1323
cmp_oeq_vf16 1324 cmp_one_vf16 1325 cmp_olt_vf16 1326 cmp_oge_vf16 1327
cmp_o_vf16 1328 cmp_ueq_vf16 1329 cmp_une_vf16 1330 cmp_ult_vf16 1331
cmp_uge_vf16 1332 cmp_u_vf16 1333 reserved 1334 reserved 1335
add_alt_vf16 1336 sub_alt_vf16 1337 add_horiz_vf16 1338 sub_horiz_vf16 1339
mul_horiz_vf16 1340 dot_vf16 1341 merge_low_vf16 1342 merge_high_vf16 1343
unpack_high_vf16 1344 unpack_low_vf16 1345 pack_vf16 1346 cvt_vf16_vi16 1347
cvt_vf16_vu16 1348 cvt_vi16_vf16 1349 cvt_vu16_vf16 1350 reserved 1351
reserved 1352 reserved 1353 reserved 1354 reserved 1355
reserved 1356 reserved 1357 reserved 1358 reserved 1359
reserved 1360 reserved 1361 reserved 1362 reserved 1363
reserved 1364 reserved 1365 reserved 1366 reserved 1367
reserved 1368 reserved 1369 reserved 1370 reserved 1371
reserved 1372 reserved 1373 reserved 1374 reserved 1375
class_f32 1376 sqrt_f32 1377 rsqrt_f32 1378 add_f32 1379
sub_f32 1380 nadd_f32 1381 mul_f32 1382 nmul_f32 1383
div_f32 1384 neg_f32 1385 abs_f32 1386 nabs_f32 1387
abs_diff_f32 1388 nabs_diff_f32 1389 min_f32 1390 max_f32 1391
minnum_f32 1392 maxnum_f32 1393 abs_min_f32 1394 abs_max_f32 1395
round_f32 1396 roundnx_f32 1397 reserved 1398 reserved 1399
cmp_oeq_f32 1400 cmp_one_f32 1401 cmp_olt_f32 1402 cmp_oge_f32 1403
cmp_o_f32 1404 cmp_ueq_f32 1405 cmp_une_f32 1406 cmp_ult_f32 1407
cmp_uge_f32 1408 cmp_u_f32 1409 reserved 1410 reserved 1411
reserved 1412 reserved 1413 reserved 1414 reserved 1415
cvt_f32_i32 1416 cvt_f32_u32 1417 cvt_i32_f32 1418 cvt_u32_f32 1419
cvt_f32_i64 1420 cvt_f32_u64 1421 cvt_i64_f32 1422 cvt_u64_f32 1423
cvt_f32_i128 1424 cvt_f32_u128 1425 cvt_i128_f32 1426 cvt_u128_f32 1427
reserved 1428 reserved 1429 reserved 1430 reserved 1431
extend_f32_f64 1432 cvt_f64_f32 1433 reserved 1434 reserved 1435
reserved 1436 reserved 1437 reserved 1438 reserved 1439
reserved 1440 reserved 1441 reserved 1442 reserved 1443
reserved 1444 reserved 1445 reserved 1446 reserved 1447
reserved 1448 reserved 1449 reserved 1450 reserved 1451
reserved 1452 reserved 1453 reserved 1454 reserved 1455
neg_vf32 1456 abs_vf32 1457 nabs_vf32 1458 abs_diff_vf32 1459
nabs_diff_vf32 1460 rsqrt_vf32 1461 sqrt_vf32 1462 add_vf32 1463
sub_vf32 1464 nadd_vf32 1465 mul_vf32 1466 nmul_vf32 1467
div_vf32 1468 min_vf32 1469 max_vf32 1470 minnum_vf32 1471
maxnum_vf32 1472 abs_min_vf32 1473 abs_max_vf32 1474 round_vf32 1475
roundnx_vf32 1476 reserved 1477 reserved 1478 reserved 1479
reserved 1480 reserved 1481 reserved 1482 reserved 1483
cmp_oeq_vf32 1484 cmp_one_vf32 1485 cmp_olt_vf32 1486 cmp_oge_vf32 1487
cmp_o_vf32 1488 cmp_ueq_vf32 1489 cmp_une_vf32 1490 cmp_ult_vf32 1491
cmp_uge_vf32 1492 cmp_u_vf32 1493 reserved 1494 reserved 1495
add_alt_vf32 1496 sub_alt_vf32 1497 add_horiz_vf32 1498 sub_horiz_vf32 1499
mul_horiz_vf32 1500 dot_vf32 1501 merge_low_vf32 1502 merge_high_vf32 1503
unpack_high_vf32 1504 unpack_low_vf32 1505 pack_vf32 1506 cvt_vf32_vi32 1507
cvt_vf32_vu32 1508 cvt_vi32_vf32 1509 cvt_vu32_vf32 1510 reserved 1511
reserved 1512 reserved 1513 reserved 1514 reserved 1515
reserved 1516 reserved 1517 reserved 1518 reserved 1519
reserved 1520 reserved 1521 reserved 1522 reserved 1523
reserved 1524 reserved 1525 reserved 1526 reserved 1527
reserved 1528 reserved 1529 reserved 1530 reserved 1531
reserved 1532 reserved 1533 reserved 1534 reserved 1535
class_f64 1536 sqrt_f64 1537 rsqrt_f64 1538 add_f64 1539
sub_f64 1540 nadd_f64 1541 mul_f64 1542 nmul_f64 1543
div_f64 1544 neg_f64 1545 abs_f64 1546 nabs_f64 1547
abs_diff_f64 1548 nabs_diff_f64 1549 min_f64 1550 max_f64 1551
minnum_f64 1552 maxnum_f64 1553 abs_min_f64 1554 abs_max_f64 1555
round_f64 1556 roundnx_f64 1557 reserved 1558 reserved 1559
cmp_oeq_f64 1560 cmp_one_f64 1561 cmp_olt_f64 1562 cmp_oge_f64 1563
cmp_o_f64 1564 cmp_ueq_f64 1565 cmp_une_f64 1566 cmp_ult_f64 1567
cmp_uge_f64 1568 cmp_u_f64 1569 reserved 1570 reserved 1571
reserved 1572 reserved 1573 reserved 1574 reserved 1575
cvt_f64_i32 1576 cvt_f64_u32 1577 cvt_i32_f64 1578 cvt_u32_f64 1579
cvt_f64_i64 1580 cvt_f64_u64 1581 cvt_i64_f64 1582 cvt_u64_f64 1583
cvt_f64_i128 1584 cvt_f64_u128 1585 cvt_i128_f64 1586 cvt_u128_f64 1587
reserved 1588 reserved 1589 reserved 1590 reserved 1591
reserved 1592 reserved 1593 reserved 1594 reserved 1595
reserved 1596 reserved 1597 reserved 1598 reserved 1599
reserved 1600 reserved 1601 reserved 1602 reserved 1603
reserved 1604 reserved 1605 reserved 1606 reserved 1607
reserved 1608 reserved 1609 reserved 1610 reserved 1611
reserved 1612 reserved 1613 reserved 1614 reserved 1615
neg_vf64 1616 abs_vf64 1617 nabs_vf64 1618 abs_diff_vf64 1619
nabs_diff_vf64 1620 rsqrt_vf64 1621 sqrt_vf64 1622 add_vf64 1623
sub_vf64 1624 nadd_vf64 1625 mul_vf64 1626 nmul_vf64 1627
div_vf64 1628 min_vf64 1629 max_vf64 1630 minnum_vf64 1631
maxnum_vf64 1632 abs_min_vf64 1633 abs_max_vf64 1634 round_vf64 1635
roundnx_vf64 1636 reserved 1637 reserved 1638 reserved 1639
reserved 1640 reserved 1641 reserved 1642 reserved 1643
cmp_oeq_vf64 1644 cmp_one_vf64 1645 cmp_olt_vf64 1646 cmp_oge_vf64 1647
cmp_o_vf64 1648 cmp_ueq_vf64 1649 cmp_une_vf64 1650 cmp_ult_vf64 1651
cmp_uge_vf64 1652 cmp_u_vf64 1653 reserved 1654 reserved 1655
add_alt_vf64 1656 sub_alt_vf64 1657 add_horiz_vf64 1658 sub_horiz_vf64 1659
mul_horiz_vf64 1660 dot_vf64 1661 merge_low_vf64 1662 merge_high_vf64 1663
unpack_high_vf64 1664 unpack_low_vf64 1665 pack_vf64 1666 cvt_vf64_vi64 1667
cvt_vf64_vu64 1668 cvt_vi64_vf64 1669 cvt_vu64_vf64 1670 reserved 1671
reserved 1672 reserved 1673 reserved 1674 reserved 1675
reserved 1676 reserved 1677 reserved 1678 reserved 1679
reserved 1680 reserved 1681 reserved 1682 reserved 1683
reserved 1684 reserved 1685 reserved 1686 reserved 1687
reserved 1688 reserved 1689 reserved 1690 reserved 1691
reserved 1692 reserved 1693 reserved 1694 reserved 1695
class_f128 1696 sqrt_f128 1697 rsqrt_f128 1698 add_f128 1699
sub_f128 1700 nadd_f128 1701 mul_f128 1702 nmul_f128 1703
div_f128 1704 neg_f128 1705 abs_f128 1706 nabs_f128 1707
abs_diff_f128 1708 nabs_diff_f128 1709 min_f128 1710 max_f128 1711
minnum_f128 1712 maxnum_f128 1713 abs_min_f128 1714 abs_max_f128 1715
round_f128 1716 roundnx_f128 1717 reserved 1718 reserved 1719
cmp_oeq_f128 1720 cmp_one_f128 1721 cmp_olt_f128 1722 cmp_oge_f128 1723
cmp_o_f128 1724 cmp_ueq_f128 1725 cmp_une_f128 1726 cmp_ult_f128 1727
cmp_uge_f128 1728 cmp_u_f128 1729 reserved 1730 reserved 1731
reserved 1732 reserved 1733 reserved 1734 reserved 1735
cvt_f128_i32 1736 cvt_f128_u32 1737 cvt_i32_f128 1738 cvt_u32_f128 1739
cvt_f128_i64 1740 cvt_f128_u64 1741 cvt_i64_f128 1742 cvt_u64_f128 1743
cvt_f128_i128 1744 cvt_f128_u128 1745 cvt_i128_f128 1746 cvt_u128_f128 1747
reserved 1748 reserved 1749 reserved 1750 reserved 1751
extend_f32_f128 1752 extend_f64_f128 1753 extend_f16_f128 1754 cvt_f128_f64 1755
cvt_f128_f32 1756 cvt_f128_f16 1757 scale_f128 1758 reserved 1759
reserved 1760 reserved 1761 reserved 1762 reserved 1763
reserved 1764 reserved 1765 reserved 1766 reserved 1767
reserved 1768 reserved 1769 reserved 1770 reserved 1771
reserved 1772 reserved 1773 reserved 1774 reserved 1775
reserved 1776 reserved 1777 reserved 1778 reserved 1779
reserved 1780 reserved 1781 reserved 1782 reserved 1783
reserved 1784 reserved 1785 reserved 1786 reserved 1787
reserved 1788 reserved 1789 reserved 1790 reserved 1791
max_vi8 1792 max_vu8 1793 min_vi8 1794 min_vu8 1795
add_vu8 1796 sub_vu8 1797 addo_vi8 1798 subo_vi8 1799
addc_vu8 1800 subb_vu8 1801 add_sat_vu8 1802 add_sat_vi8 1803
sub_sat_vi8 1804 sub_sat_vu8 1805 avg_vi8 1806 avg_vu8 1807
cmp_eq_vi8 1808 cmp_lt_vi8 1809 cmp_lt_vu8 1810 sll_vu8 1811
sll_imm_vu8 1812 srl_vu8 1813 srl_imm_vu8 1814 sra_vi8 1815
sra_imm_vi8 1816 rol_vu8 1817 ror_vu8 1818 merge_high_vu8 1819
merge_low_vu8 1820 reserved 1821 reserved 1822 reserved 1823
unpack_low_vi8 1824 unpack_high_vi8 1825 unpack_low_vu8 1826 unpack_high_vu8 1827
reserved 1828 reserved 1829 reserved 1830 reserved 1831
reserved 1832 reserved 1833 reserved 1834 reserved 1835
reserved 1836 reserved 1837 reserved 1838 reserved 1839
reserved 1840 reserved 1841 reserved 1842 reserved 1843
reserved 1844 reserved 1845 reserved 1846 reserved 1847
reserved 1848 reserved 1849 reserved 1850 reserved 1851
reserved 1852 reserved 1853 reserved 1854 reserved 1855
max_vi16 1856 max_vu16 1857 min_vi16 1858 min_vu16 1859
add_vu16 1860 sub_vu16 1861 addo_vi16 1862 subo_vi16 1863
addc_vu16 1864 subb_vu16 1865 add_sat_vu16 1866 add_sat_vi16 1867
sub_sat_vi16 1868 sub_sat_vu16 1869 avg_vi16 1870 avg_vu16 1871
cmp_eq_vi16 1872 cmp_lt_vi16 1873 cmp_lt_vu16 1874 sll_vu16 1875
sll_imm_vu16 1876 srl_vu16 1877 srl_imm_vu16 1878 sra_vi16 1879
sra_imm_vi16 1880 rol_vu16 1881 ror_vu16 1882 merge_high_vu16 1883
merge_low_vu16 1884 reserved 1885 reserved 1886 reserved 1887
unpack_low_vi16 1888 unpack_high_vi16 1889 unpack_low_vu16 1890 unpack_high_vu16 1891
pack_sat_vi16 1892 pack_sat_vu16 1893 pack_mod_vu16 1894 pack_usat_vi16 1895
reserved 1896 reserved 1897 reserved 1898 reserved 1899
reserved 1900 reserved 1901 reserved 1902 reserved 1903
reserved 1904 reserved 1905 reserved 1906 reserved 1907
reserved 1908 reserved 1909 reserved 1910 reserved 1911
reserved 1912 reserved 1913 reserved 1914 reserved 1915
reserved 1916 reserved 1917 reserved 1918 reserved 1919
max_vi32 1920 max_vu32 1921 min_vi32 1922 min_vu32 1923
add_vu32 1924 sub_vu32 1925 addo_vi32 1926 subo_vi32 1927
addc_vu32 1928 subb_vu32 1929 add_sat_vu32 1930 add_sat_vi32 1931
sub_sat_vi32 1932 sub_sat_vu32 1933 avg_vi32 1934 avg_vu32 1935
cmp_eq_vi32 1936 cmp_lt_vi32 1937 cmp_lt_vu32 1938 sll_vu32 1939
sll_imm_vu32 1940 srl_vu32 1941 srl_imm_vu32 1942 sra_vi32 1943
sra_imm_vi32 1944 rol_vu32 1945 ror_vu32 1946 merge_high_vu32 1947
merge_low_vu32 1948 reserved 1949 reserved 1950 reserved 1951
unpack_low_vi32 1952 unpack_high_vi32 1953 unpack_low_vu32 1954 unpack_high_vu32 1955
pack_sat_vi32 1956 pack_sat_vu32 1957 pack_mod_vu32 1958 pack_usat_vi32 1959
reserved 1960 reserved 1961 reserved 1962 reserved 1963
reserved 1964 reserved 1965 reserved 1966 reserved 1967
reserved 1968 reserved 1969 reserved 1970 reserved 1971
reserved 1972 reserved 1973 reserved 1974 reserved 1975
reserved 1976 reserved 1977 reserved 1978 reserved 1979
reserved 1980 reserved 1981 reserved 1982 reserved 1983
max_vi64 1984 max_vu64 1985 min_vi64 1986 min_vu64 1987
add_vu64 1988 sub_vu64 1989 addo_vi64 1990 subo_vi64 1991
addc_vu64 1992 subb_vu64 1993 add_sat_vu64 1994 add_sat_vi64 1995
sub_sat_vi64 1996 sub_sat_vu64 1997 avg_vi64 1998 avg_vu64 1999
cmp_eq_vi64 2000 cmp_lt_vi64 2001 cmp_lt_vu64 2002 sll_vu64 2003
sll_imm_vu64 2004 srl_vu64 2005 srl_imm_vu64 2006 sra_vi64 2007
sra_imm_vi64 2008 rol_vu64 2009 ror_vu64 2010 merge_high_vu64 2011
merge_low_vu64 2012 reserved 2013 reserved 2014 reserved 2015
reserved 2016 reserved 2017 reserved 2018 reserved 2019
pack_sat_vi64 2020 pack_sat_vu64 2021 pack_mod_vu64 2022 pack_usat_vi64 2023
reserved 2024 reserved 2025 reserved 2026 reserved 2027
reserved 2028 reserved 2029 reserved 2030 reserved 2031
reserved 2032 reserved 2033 reserved 2034 reserved 2035
reserved 2036 reserved 2037 reserved 2038 reserved 2039
reserved 2040 reserved 2041 reserved 2042 reserved 2043
reserved 2044 reserved 2045 reserved 2046 reserved 2047

Instruction set statistics

statistics by instruction subsets:
instruction subset all hardwired pseudo-ops
sum: 1167 1162 5
base 181 181 0
memory 112 112 0
branch 74 74 0
jump 28 28 0
nullifying 74 74 0
bitmanip 14 14 0
i128 41 41 0
f128 57 56 1
f64 104 102 2
f32 106 104 2
f16 106 106 0
mmx 140 140 0
special 19 19 0
atomic 66 66 0
privileged 17 17 0
cipher 8 8 0
group 20 20 0

statistics by instruction opcodes (1162 codes, 20 groups):
opcode num
primary opcodes 122
br_misc 4
br_32 16
br_64 16
br_128 16
fused 87
nul_misc 4
nul_32 16
nul_64 16
nul_128 16
mem_xi64 15
mem_xu64 15
mem_xi32 15
mem_xu32 15
fma_f16 10
fma_f32 10
fma_f64 10
fma_f128 9
loop 16
raopx 14
misc 720