
WO2018133568A1 - Composite mode neuron information processing method and system, and computer device - Google Patents

Composite mode neuron information processing method and system, and computer device

Info

Publication number
WO2018133568A1
WO2018133568A1 PCT/CN2017/114661 CN2017114661W WO2018133568A1 WO 2018133568 A1 WO2018133568 A1 WO 2018133568A1 CN 2017114661 W CN2017114661 W CN 2017114661W WO 2018133568 A1 WO2018133568 A1 WO 2018133568A1
Authority
WO
WIPO (PCT)
Prior art keywords
neuron
information
current
pulse
output information
Prior art date
Application number
PCT/CN2017/114661
Other languages
English (en)
Chinese (zh)
Inventor
裴京
邓磊
施路平
吴臻志
李国齐
Original Assignee
清华大学
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 清华大学
Publication of WO2018133568A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/04Architecture, e.g. interconnection topology
    • G06N3/049Temporal neural networks, e.g. delay elements, oscillating neurons or pulsed inputs
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06NCOMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00Computing arrangements based on biological models
    • G06N3/02Neural networks
    • G06N3/06Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons
    • G06N3/061Physical realisation, i.e. hardware implementation of neural networks, neurons or parts of neurons using biological neurons, e.g. biological neurons connected to an integrated circuit

Definitions

  • the present invention relates to the field of artificial neural network technology, and in particular to a composite mode neuron information processing method, system and computer device.
  • the neurons of traditional neuromorphic systems are limited to a single information processing and transmission mode, either artificial neural network or pulsed neural network, which results in a high cost of neural network construction for a single task and low efficiency when neural networks handle multiple tasks.
  • Reading a neuron working mode configuration parameter the neuron working mode configuration parameter comprising an artificial neuron working mode configuration parameter or a pulsed neuron working mode configuration parameter;
  • the current neuron working mode includes an artificial neuron working mode or a pulsed neuron working mode
  • front-end neuron output information includes front-end artificial neuron output information or front-end pulse neuron output information;
  • Reading current neuron information including current artificial neuron information or current pulsed neuron information;
  • Calculating current neuron output information according to the front-end neuron output information and the current neuron information includes: calculating current artificial neuron output information according to the front-end artificial neuron output information and the current artificial neuron information, or calculating current pulse neuron output information according to the front-end pulse neuron output information and the current pulse neuron information;
  • the front end artificial neuron output information includes: membrane potential information output by the front end artificial neuron, and a connection weight index of the front end artificial neuron and the current artificial neuron;
  • the current artificial neuron information includes: current artificial neuron offset information;
  • calculating the current artificial neuron output information includes:
  • according to the connection weight index of the front-end artificial neuron and the current artificial neuron, the connection weight of the front-end artificial neuron and the current artificial neuron is read;
  • according to the membrane potential information output by the front-end artificial neuron, the connection weight of the front-end artificial neuron and the current artificial neuron, and the current artificial neuron bias information, the current artificial neuron output information is calculated by a preset artificial neuron activation function. After the step of calculating the current artificial neuron output information, the method further includes:
  • the outputting of the current artificial neuron output information includes:
  • the front-end pulse neuron output information includes: pulse tip information output by the front-end pulse neuron, and a connection weight index of the front-end pulse neuron and the current pulse neuron;
  • the current pulse neuron information includes: a current time window width, a pulse tip information sequence in a current time window, historical membrane potential information, and membrane potential leakage information;
  • the current pulse neuron output information including:
  • according to the connection weight index of the front-end pulse neuron and the current pulse neuron, the connection weight of the front-end pulse neuron and the current pulse neuron is read;
  • according to the front-end pulse neuron input information, the connection weight of the front-end pulse neuron and the current pulse neuron, the historical membrane potential information, and the membrane potential leakage information, the current pulse neuron output information is calculated by a pulse neuron calculation model;
  • after the step of calculating the current pulse neuron output information and before the step of outputting the current pulse neuron output information, the method further includes:
  • the issuance trigger flag information includes: issuance triggered or issuance not triggered; and when the issuance trigger flag information is issuance triggered,
  • the refractory period timer is reset, and the historical membrane potential information is updated to a preset reset membrane potential information.
  • the method further includes:
  • when the issuance trigger flag information is issuance not triggered, the refractory period width and the current time step of the refractory period timer are read;
  • the obtaining a threshold potential includes:
  • the threshold potential is determined based on the threshold random amount and the threshold offset.
  • the outputting the current pulse neuron output information comprises:
  • Reading a second issuance enablement identifier where the second issuance enablement identifier includes allowing data to be issued or not allowing data to be issued, and when the second issuance enablement identifier is allowed to issue data,
  • the artificial neuron output information is compared with the potential extremum, and if the artificial neuron output information is greater than or equal to the potential extremum, the potential extremum is updated to the artificial neuron output information. This allows the neural network to complete the pooling step directly in the convolutional layer, without allocating physical space for pooling-layer neurons, greatly saving neuron resources and thereby reducing the construction cost of the neural network.
  • the current artificial neuron output information is determined by setting the release enable identifier and the artificial neuron issuance data type parameter, so that the output of the artificial neuron is more controllable. The release enable flag can be configured so that a neuron is not allowed to issue data and is used only as an intermediate auxiliary computational neuron, which is necessary for some functions that require multiple neurons to work together.
  • the release enable flag and the issuance data type can also work together to reduce the neuron output information of a set region to a single current neuron output information (the maximum value among the neuron output information of that region), completing a direct maximum pooling operation.
  • the pulse tip information sequence in the current time window is updated according to the pulse tip information output by the front-end pulse neuron and the existing pulse tip information sequence in the current time window, and the pulse tip information update sequence of the current time window is acquired. Together with the current time window width and the connection weight of the front-end pulse neuron and the current pulse neuron, the front-end pulse neuron input information is calculated through an attenuation function, which supports spatiotemporal pulse neural network models with time depth. Compared with neural network schemes having only one time depth, this greatly improves the spatio-temporal information coding ability of the pulse neural network and enriches the application space of the pulse neural network.
  • the threshold potential is determined by reading a random threshold mask potential and a threshold offset and receiving a configuration value given by a configuration register, so that the neuron issues pulse tip information with a degree of randomness: regardless of whether the membrane potential exceeds the fixed threshold bias, the neuron cell body may issue a pulse because of the positive or negative threshold random superposition amount, which improves the computational power and information processing capability of the pulse neural network model.
  • the present invention also provides a computer device comprising a memory, a processor, and a computer program stored on the memory and operable on the processor, the processor executing the computer program to implement the steps of the method of any of the above embodiments.
  • the invention also provides a composite mode neuron information processing system, comprising:
  • a neuron working mode reading module configured to read a neuron working mode configuration parameter, where the neuron working mode configuration parameter comprises an artificial neuron working mode configuration parameter or a pulsed neuron working mode configuration parameter;
  • a neuron working mode configuration module configured to configure a current neuron working mode according to the neuron working mode configuration parameter, where the current neuron working mode includes an artificial neuron working mode or a pulsed neuron working mode;
  • the front-end neuron output information receiving module is configured to receive front-end neuron output information, where the front-end neuron output information includes front-end artificial neuron output information or front-end pulse neuron output information;
  • a current neuron information reading module configured to read current neuron information, where the current neuron information includes current artificial neuron information or current pulsed neuron information;
  • the current neuron output information calculation module is configured to calculate current neuron output information according to the front-end neuron output information and the current neuron information, including calculating current artificial neuron output information according to the front-end artificial neuron output information and the current artificial neuron information, or calculating current pulse neuron output information according to the front-end pulse neuron output information and the current pulse neuron information;
  • the current neuron output information output module is configured to output the current artificial neuron output information or the current pulse neuron output information.
  • the composite mode neuron information processing method, computer device and system provided by the present invention configure the corresponding artificial neuron working mode or pulsed neuron working mode according to a preset neuron working mode configuration parameter; in the corresponding neuron working mode, the current artificial neuron output information or the current pulsed neuron output information is calculated and output by receiving the front-end neuron output information and reading the current neuron information.
  • the composite mode neuron information processing system provided by the invention can configure the corresponding neuron working mode according to the requirements of the task; when switching between tasks that run in different neural network working modes, only the neuron working mode configuration parameters need to be modified.
  • Unlike neural network accelerator schemes that support only the artificial neural network mode and neuromorphic schemes that support only the pulse neural network, the present invention can support, on the same architecture, machine learning applications based on artificial neural networks and computational neuroscience applications based on pulsed neural networks. This enriches the types of information processed by a brain-like computing platform, reduces the cost of multi-task execution in different neural network working modes, and improves the efficiency of multi-task execution in different neural network working modes.
  • FIG. 1 is a schematic flow chart of a multi-mode neural network information processing method according to an embodiment
  • FIG. 2 is a schematic flow chart of a multi-mode neural network information processing method according to another embodiment
  • FIG. 3 is a schematic flow chart of a multi-mode neural network information processing method according to still another embodiment
  • FIG. 4 is a schematic flow chart of a multi-mode neural network information processing method according to still another embodiment
  • FIG. 5 is a schematic flowchart diagram of a multi-mode neural network information processing method according to still another embodiment
  • FIG. 6 is a schematic flow chart of a multi-mode neural network information processing method according to still another embodiment
  • FIG. 7 is a schematic structural diagram of a multi-mode neural network information processing system according to an embodiment
  • FIG. 8 is a schematic structural diagram of a multi-mode neural network information processing system according to another embodiment.
  • FIG. 9 is a schematic structural diagram of a multi-mode neural network information processing system according to still another embodiment.
  • FIG. 10 is a schematic structural diagram of a multi-mode neural network information processing system according to still another embodiment.
  • FIG. 11 is a schematic structural diagram of a multi-mode neural network information processing system according to still another embodiment
  • FIG. 12 is a schematic structural diagram of a multi-mode neural network information processing system according to still another embodiment.
  • FIG. 1 is a schematic flowchart of a multi-mode neural network information processing method according to an embodiment, and the multi-mode neural network information processing method shown in FIG. 1 includes:
  • Step S100 reading a neuron working mode configuration parameter, where the neuron working mode configuration parameter comprises an artificial neuron working mode configuration parameter or a pulsed neuron working mode configuration parameter.
  • the present invention provides a multi-mode and multi-functional composite neuromorphic cell unit based on an all-digital circuit, integrating two kinds of information processing and transmission modes: the artificial neural network (ANN) and the pulsed neural network (SNN).
  • the neuron cell unit can be set to the ANN or SNN working mode according to the processing requirements of different tasks.
  • the neuron working mode configuration parameter defines the working mode of the neural network.
  • Step S200 Configure a current neuron working mode according to the neuron working mode configuration parameter, where the current neuron working mode includes an artificial neuron working mode or a pulsed neuron working mode.
  • the neuron cell unit may be set to an ANN or SNN working mode.
  • Step S300 receiving front-end neuron output information, where the front-end neuron output information includes front-end artificial neuron output information or front-end pulse neuron output information.
  • Depending on the configured working mode, different front-end neuron output information is received: if configured in the artificial neuron working mode, front-end artificial neuron output information is received; if configured in the pulsed neuron working mode, front-end pulse neuron output information is received.
  • Step S400 reading current neuron information, the current neuron information including current artificial neuron information or current pulsed neuron information.
  • Depending on the configured working mode, different current neuron information is read: if configured in the artificial neuron working mode, current artificial neuron information is read; if configured in the pulsed neuron working mode, current pulse neuron information is read.
  • Step S500 calculating current neuron output information according to the front-end neuron output information and the current neuron information, including calculating current artificial neurons according to the front-end artificial neuron output information and the current artificial neuron information Outputting information, or calculating current pulse neuron output information according to the front end pulse neuron output information and the current pulse neuron information.
  • the calculation process of the artificial neuron or the calculation process of the pulse neuron is performed, and the current artificial neuron output information or the current pulse neuron output information is obtained.
  • Step S600 outputting the current artificial neuron output information or the current pulse neuron output information.
  • According to the configured neuron working mode, when working in the artificial neuron mode, the current artificial neuron output information is output in the output manner of the artificial neural network; when working in the pulsed neuron mode, the current pulsed neuron output information is output in the output manner of the pulsed neural network.
  • the composite mode neuron information processing method provided by the invention configures the corresponding artificial neuron working mode or pulsed neuron working mode according to a preset neuron working mode configuration parameter; in the corresponding neuron working mode, it receives the front-end neuron output information, reads the current neuron information, calculates the current artificial neuron output information or the current pulsed neuron output information, and outputs it.
  • the composite mode neuron information processing method provided by the invention can configure the corresponding neuron working mode according to the requirements of the task; when switching between tasks that run in different neural network working modes, only the neuron working mode configuration parameters need to be modified.
  • Unlike neural network accelerator schemes that support only the artificial neural network mode and neuromorphic schemes that support only the pulse neural network, the present invention can support, on the same architecture, machine learning applications based on artificial neural networks and computational neuroscience applications based on pulsed neural networks. This enriches the types of information processed by a brain-like computing platform, reduces the cost of multi-task execution in different neural network working modes, and improves the efficiency of multi-task execution in different neural network working modes. A minimal sketch of the mode dispatch described in steps S100 to S600 is given below.
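  • The following Python fragment is only an illustrative model of that dispatch logic, not the all-digital circuit of the invention; the class name CompositeNeuron, the mode strings and the stubbed step methods are hypothetical, and the ANN and SNN bodies are sketched separately after the later embodiments.

        # Hypothetical sketch of the composite-mode dispatch (steps S100-S600).
        class CompositeNeuron:
            def __init__(self, mode_config):
                # S100/S200: read the working-mode configuration parameter and
                # configure the current neuron working mode ("ANN" or "SNN").
                assert mode_config in ("ANN", "SNN")
                self.mode = mode_config

            def step(self, front_end_output, neuron_info):
                # S300/S400: receive front-end neuron output information and read
                # the current neuron information for the configured mode.
                if self.mode == "ANN":
                    return self._ann_step(front_end_output, neuron_info)   # S500, ANN path
                return self._snn_step(front_end_output, neuron_info)       # S500, SNN path

            def _ann_step(self, front_end_output, neuron_info):
                ...   # weighted membrane potentials + bias -> activation (sketched later)

            def _snn_step(self, front_end_output, neuron_info):
                ...   # windowed spikes x weights + leak -> spike decision (sketched later)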
  • FIG. 2 is a schematic flowchart of a multi-mode neural network information processing method according to another embodiment. The method shown in FIG. 2 refines steps S300 to S500 of the method shown in FIG. 1 when the neuron working mode is configured as the artificial neuron working mode, and includes:
  • Step S100a receiving the membrane potential information output by the front end artificial neuron, and the connection weight index of the front end artificial neuron and the current artificial neuron.
  • connection weight index of the front end artificial neuron and the current artificial neuron is a weight index sent by the front end neuron together with the front end artificial neuron output information, and is used to indicate the extraction of the current neuron weight.
  • step S200a the current artificial neuron offset information is read.
  • the offset information of the artificial neuron is a membrane potential bias value.
  • Step S300a The connection weight of the front end artificial neuron and the current artificial neuron is read according to the connection weight index of the front end artificial neuron and the current artificial neuron.
  • the connection weight index of the front-end artificial neuron and the current artificial neuron is address information; according to the received connection weight index, the current neuron reads the connection weight of the front-end artificial neuron and the current artificial neuron.
  • With this connection weight information, the output information of the front-end neuron can participate in the calculation of the current neuron output information with a weight that more accurately reflects its contribution.
  • Step S400a according to the membrane potential information output by the front end artificial neuron, the connection weight of the front end artificial neuron and the current artificial neuron, and the current artificial neuron bias information, by a preset artificial neuron activation function , calculate the current artificial neuron output information.
  • V_ANN = f(V_input + V_bias)
  • V_input is the accumulated input of the current beat, i.e. the sum of the membrane potential information output by the front-end artificial neurons weighted by the corresponding connection weights; V_bias is the current artificial neuron bias information; and f is the preset artificial neuron activation function.
  • the composite mode neuron information processing system provided by the embodiment is configured as an artificial neuron working mode according to the requirements of the task, and can realize most of the current artificial neural network models and support the application of the brain computing platform in the field of machine learning.
  • After the current artificial neuron output information is calculated by the preset artificial neuron activation function from the membrane potential information output by the front-end artificial neuron, the connection weight of the front-end artificial neuron and the current artificial neuron, and the current artificial neuron bias information, the current artificial neuron output information is compared with the potential extremum; if it is greater than or equal to the potential extremum, the potential extremum is updated to the current artificial neuron output information.
  • This enables the neural network to complete the pooling step directly in the convolutional layer without allocating physical space for pooling-layer neurons, greatly saving neuron resources and thereby reducing the construction cost of the neural network. A numeric sketch of the artificial-neuron computation is given below.
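  • The following is a minimal sketch of the artificial neuron path (steps S100a to S400a), assuming a ReLU as the preset activation function f and a plain dictionary as the weight table addressed by the connection weight index; both are illustrative assumptions, not part of the patent.

        # Sketch of the ANN working mode: V_ANN = f(V_input + V_bias).
        def ann_neuron_output(front_end_potentials, weight_indices, weight_table, v_bias):
            # S300a: read the connection weights via the connection weight index.
            weights = [weight_table[i] for i in weight_indices]
            # S400a: accumulate the current beat's input V_input ...
            v_input = sum(w * v for w, v in zip(weights, front_end_potentials))
            # ... and apply the preset activation function to V_input + V_bias.
            return max(0.0, v_input + v_bias)   # f assumed to be ReLU

        # Example: two front-end neurons, weights 0.5 and -0.2, bias 0.1.
        print(ann_neuron_output([1.0, 2.0], [0, 1], {0: 0.5, 1: -0.2}, 0.1))   # 0.2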
  • FIG. 3 is a schematic flowchart of a multi-mode neural network information processing method according to still another embodiment.
  • the multi-mode neural network information processing method shown in FIG. 3 is a refinement process of step S600 of FIG. 1, and includes:
  • Step S610a reading the first issuance enablement identifier, where the first issuance enablement identifier includes allowing data to be issued or not allowing data to be issued, when the first issuance enable identifier is allowed to issue data.
  • the first issuance enablement identifier is control information, determined by the task, that decides whether the final neuron output information is issued.
  • When the first issuance enablement identifier is not allowed to issue data, the calculated artificial neuron output information is not issued and the process ends; when it is allowed to issue data, step S620a follows.
  • Step S620a the artificial neuron issuance data type parameter is read; the artificial neuron issuance data type includes one of: issuing the current artificial neuron output information, issuing the potential extremum, or issuing the extremum neuron identifier corresponding to the potential extremum.
  • After the artificial neuron issuance data type parameter is read, different data types may be selected according to the subsequent calculation requirements; for example, the maximum pooling operation in a subsequent convolutional neural network requires the potential extremum to be output.
  • Step S630a determining, according to the artificial neuron issuance data type parameter, the final output information of the current artificial neuron.
  • Step S640a outputting the final output information of the current artificial neuron.
  • By setting the release enable identifier and the artificial neuron issuance data type parameter, the current artificial neuron output information is determined, so that the output of the artificial neuron is more controllable. The release enable flag can be configured so that a neuron is not allowed to issue data and serves only as an intermediate auxiliary computing neuron, which is necessary for some functions that require multiple neurons to work together.
  • The release enable flag and the issuance data type can also work together to reduce the neuron output information of a set region to a single current neuron output information (the maximum value among the neuron output information of that region), completing a direct maximum pooling operation, as sketched below.
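  • The following rough sketch shows how a shared potential extremum register, the release enable flag, and the issuance data type could realize max pooling inside the convolutional layer (steps S610a to S640a); the class and parameter names are hypothetical and the register is modeled as an ordinary Python object.

        # Hypothetical sketch: max pooling through a shared potential extremum register.
        class ExtremumRegister:
            def __init__(self):
                self.potential_extremum = float("-inf")
                self.extremum_neuron_id = None

            def update(self, neuron_id, output):
                # Compare the current artificial neuron output with the potential
                # extremum and update both fields when the output is greater or equal.
                if output >= self.potential_extremum:
                    self.potential_extremum = output
                    self.extremum_neuron_id = neuron_id

        def issue(neuron_id, output, reg, enable, data_type):
            # enable: first issuance enablement identifier (S610a);
            # data_type: artificial neuron issuance data type (S620a).
            reg.update(neuron_id, output)
            if not enable:                      # not allowed to issue data:
                return None                     # auxiliary neuron, nothing issued
            if data_type == "output":
                return output                   # issue the neuron's own output
            if data_type == "extremum":
                return reg.potential_extremum   # issue the pooled maximum
            return reg.extremum_neuron_id       # issue the extremum neuron identifier

        reg = ExtremumRegister()
        for nid, out in enumerate([0.3, 0.9, 0.5]):
            issued = issue(nid, out, reg, enable=(nid == 2), data_type="extremum")
        print(issued)   # 0.9: the region's maximum, issued by a single neuron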
  • The multi-mode neural network information processing method shown in FIG. 4 refines steps S300 to S500 of the method shown in FIG. 1 when the neuron working mode is configured as the pulsed neuron working mode, and includes:
  • Step S100b Receive pulse tip information output by the front end pulse neuron, and a connection weight index of the front end pulse neuron and the current pulse neuron.
  • connection weight index of the front-end pulse neuron and the current pulse neuron is a weight index sent by the front-end neuron together with the output information of the front-end pulse neuron, and is used to indicate the extraction of the current neuron weight.
  • the pulse tip information output by the front end pulse neuron is a pulse tip signal sent by the front end pulse neuron.
  • Step S200b reading the current time window width, the pulse tip information sequence in the current time window, the historical membrane potential information, and the membrane potential leakage information.
  • the pulse tip information sequence in the current time window refers to the sequence in which the pulse tip information received at each time step within the current time window width is buffered in chronological order.
  • Step S300b reading the connection weight of the front-end pulse neuron and the current pulse neuron according to the connection weight index of the front-end pulse neuron and the current pulse neuron.
  • the connection weight index of the front-end pulse neuron and the current pulse neuron is address information; according to the received connection weight index, the current neuron reads the connection weight of the front-end pulse neuron and the current pulse neuron.
  • With this connection weight information, the output information of the front-end neuron can participate in the calculation of the current neuron output information with a weight that more accurately reflects its contribution, carrying richer information.
  • Step S400b updating the pulse tip information sequence in the current time window according to the pulse tip information output by the front-end pulse neuron and the pulse tip information sequence in the current time window, and acquiring the pulse tip information in the current time window. Update the sequence.
  • In each operation step of the pulse neuron, the pulse tip information sequence stores the newly received pulse tip information, deletes the pulse tip information at the tail position of the sequence, and thereby updates the pulse tip sequence once, as in the sketch below.
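  • A small sketch of this buffer update (step S400b), assuming the sequence is a fixed-length first-in-first-out buffer of 0/1 spike flags, one slot per time step of the window; the representation is an assumption made for illustration.

        from collections import deque

        def update_spike_window(window, new_spike, t_w):
            # window: pulse tip information sequence of the current time window,
            # newest entry first; new_spike: pulse tip information just received;
            # t_w: current time window width (number of buffered time steps).
            window = deque(window, maxlen=t_w)   # the tail entry is dropped automatically
            window.appendleft(1 if new_spike else 0)
            return window

        w = deque([0, 1, 0, 0], maxlen=4)
        print(list(update_spike_window(w, True, 4)))   # [1, 0, 1, 0]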
  • Step S500b Calculate the front-end pulse neuron input information by using the attenuation function according to the current time window width and the pulse tip information update sequence in the current time window.
  • T_w is the time window width;
  • τ_j is the time step, within the pulse tip information update sequence of the current time window, at which front-end neuron j issued a spike;
  • K(Δt) is an attenuation function that decreases rapidly as Δt increases.
  • Step S600b calculating, according to the front-end pulse neuron input information, the connection weight of the front-end pulse neuron and the current pulsed neuron, the historical membrane potential information, and the membrane potential leakage information, by calculating a pulse neuron calculation model Current pulsed neuron output information.
  • the calculation of the front-end pulse neuron input information is represented by the following formula: V_input = Σ_j ( W_ij · Σ_{τ_j} K(Δt) ), where the inner sum runs over the spikes of front-end neuron j recorded in the pulse tip information update sequence of the current time window, and Δt is the number of time steps elapsed since the spike at τ_j;
  • W_ij is the connection weight of front-end pulse neuron j and the current pulse neuron i;
  • T_w is the time window width, which bounds the spikes included in the inner sum;
  • τ_j is the time step, within the pulse tip information update sequence of the current time window, at which front-end neuron j issued a spike;
  • K(Δt) is an attenuation function that decreases rapidly as Δt increases.
  • V_SNN = f(V + V_input + V_leak)
  • V is the historical membrane potential information stored in the memory;
  • V_input is the accumulated input of the current beat, i.e. the V_input defined above, and V_leak is the membrane potential leakage value information.
  • the pulse tip information sequence in the current time window is updated according to the pulse tip information output by the front-end pulse neuron and the existing pulse tip information sequence in the current time window, and the pulse tip information update sequence of the current time window is acquired. Together with the current time window width and the connection weight of the front-end pulse neuron and the current pulse neuron, the front-end pulse neuron input information is calculated through the attenuation function, which supports spatiotemporal pulse neural network models with time depth.
  • Compared with neural network schemes having only one time depth, this greatly improves the spatio-temporal information coding ability of the pulse neural network and enriches the application space of the pulse neural network. A numeric sketch of this computation follows.
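  • The sketch below condenses steps S500b and S600b, assuming an exponential attenuation K(Δt) = exp(-Δt/τ) and an identity function for f; both are illustrative choices, since the patent only requires K to decrease rapidly as Δt increases.

        import math

        def front_end_input(windows, weights, tau=2.0):
            # windows[j]: spike-flag sequence of front-end neuron j, newest first,
            # so the index dt is the number of time steps since that spike;
            # weights[j]: connection weight W_ij of front-end neuron j and neuron i.
            v_input = 0.0
            for w_ij, window in zip(weights, windows):
                # Sum K(dt) over every spike of neuron j in the current time window.
                v_input += w_ij * sum(math.exp(-dt / tau)
                                      for dt, s in enumerate(window) if s)
            return v_input

        def snn_membrane(v_hist, v_input, v_leak):
            # V_SNN = f(V + V_input + V_leak); f assumed to be the identity here.
            return v_hist + v_input + v_leak

        windows = [[1, 0, 1, 0], [0, 0, 0, 1]]   # two front-end neurons, T_w = 4
        v_in = front_end_input(windows, weights=[0.8, -0.3])
        print(round(snn_membrane(v_hist=0.2, v_input=v_in, v_leak=-0.05), 4))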
  • FIG. 5 is a schematic flowchart of a multi-mode neural network information processing method according to still another embodiment; the method shown in FIG. 5 is performed after all the steps shown in FIG. 4 and before step S600 of FIG. 1, and includes:
  • step S100c a threshold potential is acquired.
  • the acquiring a threshold potential includes: reading a random threshold mask potential, a threshold offset, and a random threshold; The random threshold and the random threshold mask potential are subjected to bitwise AND operation to obtain a threshold random superposition amount; and the threshold potential is determined according to the threshold random superimposition amount and the threshold offset.
  • the pseudo-random number generator generates a random threshold V_rand; a bitwise AND operation of the random threshold and the preset random threshold mask potential V_mask produces the threshold random superposition amount, and the threshold random superposition amount is then added to the preset threshold offset V_th0 to produce the true threshold potential V_th.
  • the initial seed of the pseudo-random number generator is given by the configuration register V_seed. A small sketch of this construction follows.
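  • The sketch below assumes a 16-bit software pseudo-random generator as a stand-in for the hardware generator seeded by V_seed; in the hardware the superposition amount may also be interpreted as signed, which the sketch does not model.

        import random

        def threshold_potential(v_mask, v_th0, seed):
            # V_th = (V_rand & V_mask) + V_th0, with V_rand drawn from a
            # pseudo-random generator whose initial seed is given by V_seed.
            rng = random.Random(seed)        # stand-in for the hardware PRNG
            v_rand = rng.getrandbits(16)     # random threshold
            superposition = v_rand & v_mask  # bitwise AND with the mask potential
            return superposition + v_th0     # add the threshold offset

        # With mask 0x000F the random superposition lies in [0, 15] above V_th0 = 100.
        print(threshold_potential(v_mask=0x000F, v_th0=100, seed=42))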
  • Step S200c Comparing the current pulse neuron output information with the threshold potential, and determining the issuance trigger flag information according to the comparison result, where the issuing trigger flag information includes: issuing a trigger or issuing a non-trigger.
  • When the issuance trigger flag information is issuance triggered, the process proceeds to step S300c; when the issuance trigger flag information is issuance not triggered, the process proceeds to step S400c.
  • the current pulse neuron output information is compared with the threshold potential, and the issuance trigger flag information is determined according to the comparison result. Only when the current pulse neuron output information is greater than or equal to the threshold potential may the current pulse neuron output information be transmitted.
  • Step S300c resetting the refractory period timer, and updating the historical membrane potential information to a preset reset membrane potential information.
  • When the issuance trigger flag information is issuance triggered,
  • the current pulse neuron output information may be sent; the refractory period timer is reset and the historical membrane potential information is updated to a preset reset membrane potential. According to the configured reset type Reset_type, the membrane potential is selectively reset to the current membrane potential, the difference between the current membrane potential and the threshold potential, or a fixed reset voltage.
  • Step S400c reading the current time step of the refractory period width and the refractory period timer.
  • In this case the current pulse neuron output information is not sent, and it is further determined whether the current time is within the refractory period.
  • The current time is within the refractory period when the refractory period width is greater than the value of the refractory period timer, which counts in units of time steps.
  • Step S500c Determine whether the current time is within the refractory period according to the refractory period width and the current time step of the refractory period timer. If the current time is within the refractory period, step S600c is followed, and if the current time is outside the refractory period, step S700c is followed.
  • Through the cumulative count of the refractory period timer, it can be determined whether the current time step is still within the refractory period.
  • Step S600c accumulating the refractory period timer by one time step, and not updating the historical membrane potential information.
  • The historical membrane potential information is what the pulse neuron of the next time step reads; that is, during the refractory period, the pulse neuron output information calculated this time does not participate in the calculation of the next time step.
  • Step S700c accumulating the refractory period timer for one time step, and updating the historical membrane potential information as the current pulse neuron output information.
  • the historical membrane potential information is updated to the current pulsed neuron output information to participate in the calculation of the next time step.
  • the threshold potential is determined by reading the random threshold mask potential and the threshold offset and receiving the configuration value given by the configuration register, so that the neuron issues the pulse tip information with a certain probability of randomness, regardless of Whether the membrane potential exceeds a fixed threshold bias, and because there is a positive or negative threshold random superposition amount, the neuron cell body may issue pulses, which improves the computational power and information processing capability of the pulse neural network model.
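  • Steps S200c to S700c can be condensed into the following sketch; the dataclass holding the timer and membrane state, and the function name fire_and_update, are purely illustrative.

        from dataclasses import dataclass

        @dataclass
        class SpikingState:
            v_hist: float           # historical membrane potential information
            refractory_timer: int   # time steps counted since the last reset
            refractory_width: int   # refractory period width in time steps
            v_reset: float          # preset reset membrane potential

        def fire_and_update(state, v_snn, v_th):
            # Returns the issuance trigger flag and updates the state (S200c-S700c).
            triggered = v_snn >= v_th
            if triggered:
                # S300c: reset the refractory period timer and the membrane potential.
                state.refractory_timer = 0
                state.v_hist = state.v_reset
            else:
                # S400c-S700c: advance the timer; keep the old membrane potential
                # while inside the refractory period, otherwise store v_snn so that
                # it participates in the next time step.
                in_refractory = state.refractory_width > state.refractory_timer
                state.refractory_timer += 1
                if not in_refractory:
                    state.v_hist = v_snn
            return triggered

        s = SpikingState(v_hist=0.0, refractory_timer=5, refractory_width=3, v_reset=0.0)
        print(fire_and_update(s, v_snn=0.7, v_th=1.0), s.v_hist)   # False 0.7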
  • FIG. 6 is a schematic flowchart of a multi-mode neural network information processing method according to still another embodiment; the method shown in FIG. 6 refines step S600 of the method shown in FIG. 1 when the neuron working mode is configured as the pulsed neuron working mode, and includes:
  • Step S610b the second issuance enablement identifier is read; the second issuance enablement identifier includes allowing data to be issued or not allowing data to be issued.
  • the second issuance enablement identifier is control information that determines whether the final neuron output information is issued.
  • When the second issuance enablement identifier is not allowed to issue data, the calculated neuron output information is not issued and the process ends; when it is allowed to issue data, step S620b follows.
  • Step S620b the issuance trigger flag information is read.
  • When the second issuance enablement identifier is allowed to issue data and the issuance trigger flag information is issuance triggered,
  • Step S630b outputting the current pulse neuron output information.
  • By setting the release enable identifier and the issuance trigger flag, the current pulse neuron output information is determined, so that the output of the pulse neuron is more controllable; the release enable flag can be configured so that a neuron is not allowed to issue data and serves only as an intermediate auxiliary computing neuron, which is necessary for some functions that require multiple neurons to work together.
  • An embodiment of the present invention further provides a computer device, including a memory, a processor, and a computer program stored on the memory and operable on the processor, wherein the processor executes the computer program to implement the steps of the method mentioned in the above embodiments.
  • Non-volatile memory can include read only memory (ROM), programmable ROM (PROM), electrically programmable ROM (EPROM), electrically erasable programmable ROM (EEPROM), or flash memory.
  • Volatile memory can include random access memory (RAM) or external cache memory.
  • RAM is available in a variety of formats, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), Synchlink DRAM (SLDRAM), Rambus direct RAM (RDRAM), direct Rambus dynamic RAM (DRDRAM), and Rambus dynamic RAM (RDRAM).
  • FIG. 7 is a schematic structural diagram of a multi-mode neural network information processing system according to an embodiment.
  • the multi-mode neural network information processing system shown in FIG. 7 includes:
  • the neuron working mode reading module 100 is configured to read a neuron working mode configuration parameter, where the neuron working mode configuration parameter comprises an artificial neuron working mode configuration parameter or a pulsed neuron working mode configuration parameter.
  • the neuron working mode configuration module 200 is configured to configure a current neuron working mode according to the neuron working mode configuration parameter, where the current neuron working mode includes an artificial neuron working mode or a pulsed neuron working mode.
  • the front-end neuron output information receiving module 300 is configured to receive front-end neuron output information, where the front-end neuron output information includes front-end artificial neuron output information or front-end pulse neuron output information.
  • the current neuron information reading module 400 is configured to read current neuron information, where the current neuron information includes current artificial neuron information or current pulsed neuron information.
  • the current neuron output information calculation module 500 is configured to calculate current neuron output information according to the front-end neuron output information and the current neuron information, including calculating current artificial neuron output information according to the front-end artificial neuron output information and the current artificial neuron information, or calculating current pulse neuron output information according to the front-end pulse neuron output information and the current pulse neuron information.
  • the current neuron output information output module 600 is configured to output the current artificial neuron output information or the current pulse neuron output information.
  • the composite mode neuron information processing system provided by the invention configures the corresponding artificial neuron working mode or pulsed neuron working mode according to a preset neuron working mode configuration parameter; in the corresponding neuron working mode, it receives the front-end neuron output information, reads the current neuron information, calculates the current artificial neuron output information or the current pulsed neuron output information, and outputs it.
  • the composite mode neuron information processing system provided by the invention can configure the corresponding neuron working mode according to the requirements of the task; when switching between tasks that run in different neural network working modes, only the neuron working mode configuration parameters need to be modified.
  • Unlike neural network accelerator schemes that support only the artificial neural network mode and neuromorphic schemes that support only the pulse neural network, the present invention can support, on the same architecture, machine learning applications based on artificial neural networks and computational neuroscience applications based on pulsed neural networks. This enriches the types of information processed by a brain-like computing platform, reduces the cost of multi-task execution in different neural network working modes, and improves the efficiency of multi-task execution in different neural network working modes.
  • FIG. 8 is a schematic structural diagram of a multi-mode neural network information processing system according to another embodiment.
  • the multi-mode neural network information processing system shown in FIG. 8 includes:
  • the front-end artificial neuron output information receiving unit 100a is configured to receive the membrane potential information output by the front-end artificial neuron, and the connection weight index of the front-end artificial neuron and the current artificial neuron.
  • the current artificial neuron information reading unit 200a is configured to read current artificial neuron offset information.
  • the artificial neuron connection weight reading unit 300a is configured to read the connection weight of the front-end artificial neuron and the current artificial neuron according to the connection weight index of the front-end artificial neuron and the current artificial neuron.
  • the current artificial neuron output information calculation unit 400a is configured to: according to the membrane potential information output by the front end artificial neuron, the connection weight of the front end artificial neuron and the current artificial neuron, and the current artificial neuron offset information, The current artificial neuron output information is calculated by a preset artificial neuron activation function.
  • the composite mode neuron information processing system provided by the embodiment is configured as an artificial neuron working mode according to the requirements of the task, and can realize most of the current artificial neural network models and support the application of the brain computing platform in the field of machine learning.
  • FIG. 9 is a schematic structural diagram of a multi-mode neural network information processing system according to still another embodiment.
  • the multi-mode neural network information processing system shown in FIG. 9 includes:
  • a potential extremum reading unit 500a for reading a potential extremum and an extremum neuron identifier corresponding to the potential extremum;
  • the potential extreme value comparison unit 600a is configured to compare the current artificial neuron output information with the potential extreme value, and if the current artificial neuron output information is greater than or equal to the potential extreme value,
  • the potential extremum updating unit 700a is configured to update the potential extremum to the current artificial neuron output information, and update the extremum neuron identifier to an identifier of the current artificial neuron.
  • In this embodiment, the current artificial neuron output information is compared with the potential extremum; if the current artificial neuron output information is greater than or equal to the potential extremum, the potential extremum is updated to the current artificial neuron output information, so that the neural network completes the pooling step directly in the convolutional layer without allocating physical space for pooling-layer neurons, greatly saving neuron resources and thereby reducing the construction cost of the neural network.
  • FIG. 10 is a schematic structural diagram of a multi-mode neural network information processing system according to still another embodiment.
  • the multi-mode neural network information processing system shown in FIG. 10 includes:
  • the first issuance enablement reading unit 800a is configured to read the first issuance enablement identifier, where the first issuance enablement identifier includes allowing data to be issued or not allowing data to be issued; the subsequent units operate when the first issuance enablement identifier is allowed to issue data.
  • the artificial neuron issue data type reading unit 900a is configured to read an artificial neuron release data type parameter, and the artificial neuron issue data type includes: issuing the current artificial neuron output information, and issuing the potential extremum, One of the extreme neuron identifiers corresponding to the potential extremum is issued.
  • the artificial neuron issuance data type determining unit 910a is configured to determine the final output information of the current artificial neuron according to the artificial neuron issuing data type parameter.
  • the current artificial neuron output information output unit 920a is configured to output the final output information of the current artificial neuron.
  • By setting the release enable identifier and the artificial neuron issuance data type parameter, the current artificial neuron output information is determined, so that the output of the artificial neuron is more controllable. The release enable flag can be configured so that a neuron is not allowed to issue data and serves only as an intermediate auxiliary computational neuron, which is necessary for some functions that require multiple neurons to work together.
  • The release enable flag and the issuance data type can also work together to reduce the neuron output information of a set region to a single current neuron output information (the maximum value among the neuron output information of that region), completing a direct maximum pooling operation.
  • FIG. 11 is a schematic structural diagram of a multi-mode neural network information processing system according to still another embodiment.
  • the multi-mode neural network information processing system shown in FIG. 11 includes:
  • the front end pulse neuron output information receiving unit 100b is configured to receive pulse tip information output by the front end pulse neuron, and a connection weight index of the front end pulse neuron and the current pulse neuron.
  • the current pulse neuron information reading unit 200b is configured to read the current time window width, the pulse tip information sequence in the current time window, the historical membrane potential information, and the membrane potential leakage information.
  • the pulse neuron connection weight reading unit 300b is configured to read the connection weight of the front-end pulse neuron and the current pulse neuron according to the connection weight index of the front-end pulse neuron and the current pulse neuron.
  • the time-window pulse tip information sequence updating unit 400b is configured to update the pulse tip information sequence in the current time window according to the pulse tip information output by the front-end pulse neuron and the existing pulse tip information sequence in the current time window, and to obtain the pulse tip information update sequence of the current time window.
  • the front-end pulse neuron input information calculation unit 500b is configured to calculate the front-end pulse neuron input information by an attenuation function according to the current time window width and the pulse tip information update sequence of the current time window.
  • the pulse neuron output information calculation unit 600b is configured to: according to the front end pulse neuron input information, a connection weight of the front end pulse neuron and the current pulse neuron, the historical membrane potential information, the membrane potential leakage information, The current pulsed neuron output information is calculated by a pulsed neuron calculation model.
  • the pulse tip information sequence in the current time window is updated according to the pulse tip information output by the front-end pulse neuron and the existing pulse tip information sequence in the current time window, and the pulse tip information update sequence of the current time window is acquired. Together with the current time window width and the connection weight of the front-end pulse neuron and the current pulse neuron, the front-end pulse neuron input information is calculated through the attenuation function, which supports spatiotemporal pulse neural network models with time depth.
  • Compared with neural network schemes having only one time depth, this greatly improves the spatio-temporal information coding ability of the pulse neural network and enriches the application space of the pulse neural network.
  • FIG. 12 is a schematic structural diagram of a multi-mode neural network information processing system according to still another embodiment.
  • the multi-mode neural network information processing system shown in FIG. 12 includes:
  • a threshold potential obtaining unit 700b configured to acquire a threshold potential; comprising a threshold information receiving subunit for reading a random threshold mask potential, a threshold offset, and a random threshold; and a threshold random superposition amount obtaining subunit, configured to:
  • perform a bitwise AND operation on the random threshold and the random threshold mask potential to obtain the threshold random superposition amount;
  • the threshold potential determination subunit is configured to determine the threshold potential according to the threshold random superposition amount and the threshold offset.
  • the issue trigger determination unit 800b is configured to compare the current pulse neuron output information with the threshold potential, and determine the issue trigger flag information according to the comparison result, where the issue trigger flag information includes: issuing a trigger or issuing a non-trigger.
  • the triggering action unit 910b is configured to reset the refractory period timer and update the historical membrane potential information to a preset reset membrane potential information.
  • a non-trigger action unit 920b, including a refractory period judgment subunit for reading the refractory period width and the current time step of the refractory period timer and determining, according to them, whether the current time is within the refractory period; an in-refractory-period subunit which, when the refractory period judgment subunit determines that the current time is within the refractory period, accumulates the refractory period timer by one time step
  • and does not update the historical membrane potential information; and an out-of-refractory-period subunit which, when the refractory period judgment subunit determines that the current time is not within the refractory period, accumulates the refractory period timer by one time step and updates the historical membrane potential information to the current pulse neuron output information.
  • the second issuance enablement reading unit 930b is configured to read the second issuance enablement identifier, where the second issue enablement identifier includes allowing data to be issued or not allowed to be issued, and when the second issue enablement identifier is When the data is allowed to be distributed, the issuing trigger flag information reading unit is configured to read the issuing trigger flag information, and when the issuing trigger flag information is an issue trigger; the current pulse neuron output information output unit is configured to output The current pulsed neuron outputs information.
  • By setting the release enable identifier and the issuance trigger flag, the current pulse neuron output information is determined, so that the output of the pulse neuron is more controllable; the release enable flag can be configured so that a neuron is not allowed to issue data and serves only as an intermediate auxiliary computing neuron, which is necessary for some functions that require multiple neurons to work together.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Theoretical Computer Science (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Artificial Intelligence (AREA)
  • Neurology (AREA)
  • Data Mining & Analysis (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Computing Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Feedback Control In General (AREA)

Abstract

The present invention relates to a composite mode neuron information processing method and system, and a computer device. The method comprises: reading a neuron working mode configuration parameter, the neuron working mode configuration parameter comprising an artificial neuron working mode configuration parameter or a pulsed neuron working mode configuration parameter; configuring the current neuron working mode according to the neuron working mode configuration parameter; receiving front-end neuron output information; reading the current neuron information; calculating the current neuron output information according to the front-end neuron output information and the current neuron information; and outputting the current artificial neuron output information or the current pulsed neuron output information. The method simultaneously supports an artificial neural network and a pulsed neural network on the same architecture, enriches the types of information processed by a brain-like computing platform, reduces the cost of multi-task execution in different neural network working modes, and improves the efficiency of multi-task execution in different neural network working modes.
PCT/CN2017/114661 2017-01-20 2017-12-05 Procédé et système de traitement d'informations neuronales en mode composé, et dispositif informatique WO2018133568A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201710041892.3 2017-01-20
CN201710041892.3A CN106875004B (zh) 2017-01-20 2017-01-20 复合模式神经元信息处理方法和系统

Publications (1)

Publication Number Publication Date
WO2018133568A1 true WO2018133568A1 (fr) 2018-07-26

Family

ID=59158426

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2017/114661 WO2018133568A1 (fr) 2017-01-20 2017-12-05 Procédé et système de traitement d'informations neuronales en mode composé, et dispositif informatique

Country Status (2)

Country Link
CN (1) CN106875004B (fr)
WO (1) WO2018133568A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111082949A (zh) * 2019-10-29 2020-04-28 广东工业大学 一种类脑计算机中脉冲数据包高效传输方法

Families Citing this family (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106875004B (zh) * 2017-01-20 2019-09-10 北京灵汐科技有限公司 复合模式神经元信息处理方法和系统
CN107563503A (zh) * 2017-09-14 2018-01-09 胡明建 一种可编码择阀值择函数人工神经元的设计方法
CN107545304A (zh) * 2017-09-16 2018-01-05 胡明建 一种根据网络需求改变激活函数人工神经元的设计方法
CN107578096A (zh) * 2017-09-21 2018-01-12 胡明建 一种压频式择端人工神经元的设计方法
CN107578097A (zh) * 2017-09-25 2018-01-12 胡明建 一种多阀值多函数反馈人工神经元的设计方法
CN107609640A (zh) * 2017-10-01 2018-01-19 胡明建 一种阀值择端分级电位式人工神经元的设计方法
CN108171326B (zh) * 2017-12-22 2020-08-04 清华大学 神经网络的数据处理方法、装置、芯片、设备和存储介质
CN108764464B (zh) * 2018-04-12 2020-10-16 清华大学 神经元信息发送方法、装置和存储介质
CN109685252B (zh) * 2018-11-30 2023-04-07 西安工程大学 基于循环神经网络和多任务学习模型的建筑能耗预测方法
WO2021114133A1 (fr) * 2019-12-11 2021-06-17 Autonym Pte. Ltd. Procédé et système de prise de décision éclairée
CN114254106A (zh) * 2020-09-25 2022-03-29 北京灵汐科技有限公司 文本分类方法、装置、设备及存储介质
CN113569352A (zh) * 2021-07-13 2021-10-29 华中科技大学 基于机器学习的增材制造尺寸预测及工艺优化方法及系统

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102831476A (zh) * 2012-08-22 2012-12-19 中国科学院上海光学精密机械研究所 脉冲神经网络模式探测装置和模式探测方法
CN105095966A (zh) * 2015-07-16 2015-11-25 清华大学 人工神经网络和脉冲神经网络的混合计算系统
CN105095961A (zh) * 2015-07-16 2015-11-25 清华大学 一种人工神经网络和脉冲神经网络的混合系统
CN105303235A (zh) * 2015-10-26 2016-02-03 清华大学 大规模分层神经网络的构建方法
CN106875004A (zh) * 2017-01-20 2017-06-20 清华大学 复合模式神经元信息处理方法和系统

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1381721A (zh) * 2002-06-04 2002-11-27 复旦大学 便携式智能电子鼻及其制备方法
US9111225B2 (en) * 2012-02-08 2015-08-18 Qualcomm Incorporated Methods and apparatus for spiking neural computation
US9292790B2 (en) * 2012-11-20 2016-03-22 Qualcomm Incorporated Piecewise linear neuron modeling
US9418331B2 (en) * 2013-10-28 2016-08-16 Qualcomm Incorporated Methods and apparatus for tagging classes using supervised learning

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102831476A (zh) * 2012-08-22 2012-12-19 中国科学院上海光学精密机械研究所 脉冲神经网络模式探测装置和模式探测方法
CN105095966A (zh) * 2015-07-16 2015-11-25 清华大学 人工神经网络和脉冲神经网络的混合计算系统
CN105095961A (zh) * 2015-07-16 2015-11-25 清华大学 一种人工神经网络和脉冲神经网络的混合系统
CN105303235A (zh) * 2015-10-26 2016-02-03 清华大学 大规模分层神经网络的构建方法
CN106875004A (zh) * 2017-01-20 2017-06-20 清华大学 复合模式神经元信息处理方法和系统

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111082949A (zh) * 2019-10-29 2020-04-28 广东工业大学 一种类脑计算机中脉冲数据包高效传输方法
CN111082949B (zh) * 2019-10-29 2022-01-28 广东工业大学 一种类脑计算机中脉冲数据包高效传输方法

Also Published As

Publication number Publication date
CN106875004A (zh) 2017-06-20
CN106875004B (zh) 2019-09-10

Similar Documents

Publication Publication Date Title
WO2018133568A1 (fr) Procédé et système de traitement d'informations neuronales en mode composé, et dispositif informatique
CN111492381B (zh) 神经网络的功能子网络的同时训练
TWI503761B (zh) 用於在脈波編碼的網路中的突觸更新的裝置和方法
CN111738098B (zh) 一种车辆识别方法、装置、设备及存储介质
US11308395B2 (en) Method and system for performing machine learning
WO2018133570A1 (fr) Procédé de traitement d'informations neuronales à seuil auto-adaptatif, procédé et système de traitement d'informations neuronales à valeur de fuite auto-adaptative, et dispositif informatique et support d'informations lisible
CN109034371B (zh) 一种深度学习模型推理期加速方法、装置及系统
CN114781272B (zh) 碳排放量预测方法、装置、设备及存储介质
CN106875005B (zh) 自适应阈值神经元信息处理方法和系统
EP3567523B1 (fr) Évitement de regard détourné et de clignement dans des images photographiques
CN109995677A (zh) 资源分配方法、装置及存储介质
TWI729345B (zh) 事件預測方法及裝置、電子設備
CN111275054B (zh) 图像处理方法、装置、电子设备及存储介质
WO2024074072A1 (fr) Procédé et appareil d'apprentissage d'accélérateur de réseau de neurones impulsionnels, terminal et support de stockage
KR20160068823A (ko) 스파이킹 신경세포들의 망들에서의 정체 회피
JP2023548201A (ja) タスク学習システムおよび方法、ならびに関連デバイス
CN105659260B (zh) 动态地指派和检查突触延迟
WO2018133569A1 (fr) Procédé et système de traitement d'informations neuronales ayant un fenêtrage temporel profond
CN113268727A (zh) 联合训练模型方法、装置及计算机可读存储介质
CN117634564B (zh) 一种基于可编程神经拟态核的脉冲延时测量方法及系统
CN113269313A (zh) 突触权重训练方法、电子设备和计算机可读介质
CN106815638B (zh) 输入权重拓展的神经元信息处理方法和系统
WO2020019780A1 (fr) Procédé et appareil de prédiction d'événement et dispositif électronique
CN117521741A (zh) 基于事件驱动的突触更新方法及装置
CN108764464B (zh) 神经元信息发送方法、装置和存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 17892997

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 17892997

Country of ref document: EP

Kind code of ref document: A1

点击 这是indexloc提供的php浏览器服务,不要输入任何密码和下载