
CN114169240A - MMP prediction method and device based on conditional generative adversarial network - Google Patents

Info

Publication number
CN114169240A
Authority
CN
China
Prior art keywords
mmp
data
training
generator
training sample
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202111493520.7A
Other languages
Chinese (zh)
Other versions
CN114169240B (en)
Inventor
蒋丽丽
田冷
黄灿
王恒力
顾岱鸿
王嘉新
柴晓龙
王泽川
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
China University of Petroleum Beijing
Original Assignee
China University of Petroleum Beijing
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by China University of Petroleum Beijing filed Critical China University of Petroleum Beijing
Priority to CN202111493520.7A priority Critical patent/CN114169240B/en
Publication of CN114169240A publication Critical patent/CN114169240A/en
Application granted granted Critical
Publication of CN114169240B publication Critical patent/CN114169240B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G06F30/27 Design optimisation, verification or simulation using machine learning, e.g. artificial intelligence, neural networks, support vector machines [SVM] or training a model
    • G06F30/28 Design optimisation, verification or simulation using fluid dynamics, e.g. using Navier-Stokes equations or computational fluid dynamics [CFD]
    • G06N3/045 Neural network architectures: combinations of networks
    • G06N3/08 Neural networks: learning methods
    • G06F2113/08 Details relating to the application field: fluids
    • G06F2119/14 Force analysis or force optimisation, e.g. static or dynamic forces

Abstract

The embodiment of the invention discloses a method and a device for predicting MMP (minimum miscible pressure) based on a conditional generative adversarial network, wherein the method comprises the following steps: acquiring MMP influence factor data of a target oil reservoir; inputting the MMP influence factor data into a pre-trained generator to obtain an MMP prediction value of the target oil reservoir output by the pre-trained generator, wherein the generator is obtained through multiple rounds of iterative training on a training sample set, and each training sample in the training sample set comprises: the MMP value of an oil reservoir and the MMP influence factor data of that oil reservoir. The method has the beneficial effect of accurately and efficiently predicting the MMP of an oil reservoir.

Description

MMP (minimum miscible pressure) prediction method and device based on a conditional generative adversarial network
Technical Field
The invention relates to the technical field of oil reservoir development, in particular to an MMP prediction method and device based on a conditional generative adversarial network.
Background
CO2 miscible flooding is the CO2-EOR displacement mode that is most widely applied to low-permeability reservoirs and that achieves the highest recovery. When CO2 is injected into an oil reservoir for displacement, the gas, oil and water phases interact within the rock formation, resulting in inter-phase composition transfer, phase changes and other complex phase behavior. The basic mechanism of miscible flooding is that the displacing agent (the injected CO2 gas) and the displaced agent (the crude oil) form a stable miscible front under reservoir conditions; this front is a single phase whose movement effectively pushes the crude oil forward and ultimately to the production well. Because the phases are miscible, the oil-gas interface disappears and the interfacial tension in the porous medium is reduced to zero, so the microscopic displacement efficiency can theoretically reach 100%.
The minimum miscible pressure (MMP) between CO2 and the reservoir crude oil is one of the key parameters of the CO2 flooding process and is the criterion that distinguishes CO2 miscible flooding from immiscible flooding. Accurately determining the minimum miscible pressure between CO2 and crude oil is of great importance for improving CO2 miscible displacement efficiency, reducing operating costs and increasing social and economic benefits.
The prior art typically determines MMP by experimental measurement. Although this guarantees accuracy, the method is complex, time-consuming and expensive to operate. The prior art therefore lacks a more efficient way to determine the minimum miscible pressure (MMP) between CO2 and the reservoir crude oil.
Disclosure of Invention
In order to solve at least one technical problem in the background art, the present invention provides an MMP prediction method and apparatus based on a conditional generative adversarial network.
In order to achieve the above object, according to an aspect of the present invention, there is provided an MMP prediction method based on a conditional generative adversarial network, the method including:
acquiring MMP influence factor data of a target oil reservoir;
inputting the MMP influencing factor data into a pre-trained generator to obtain an MMP predicted value of the target oil reservoir output by the pre-trained generator, wherein the generator is obtained by carrying out multiple iterative training according to a training sample set, and each training sample in the training sample set comprises: MMP values of the reservoir and MMP influential factor data of the reservoir.
Optionally, the MMP prediction method based on a conditional generative adversarial network further includes:
acquiring the training sample set;
and performing H1 times of iterative training according to the training sample set to obtain the pre-trained generator, wherein each time of iterative training is divided into multiple batches of training, when each batch of training is performed, H2 training samples are selected from the training sample set, then training is performed on the network weight of the discriminator based on the selected training samples, and finally training is performed on the network weight of the generator in a combined model composed of the discriminator and the generator based on the selected training samples, wherein H1 and H2 are positive integers.
Optionally, the training sample is composed of first data and second data, the first data is MMP influence factor data of the oil reservoir, and the second data is MMP value of the oil reservoir;
the training of the network weight of the discriminator based on the selected training sample specifically comprises:
respectively combining the MMP predicted value output by the generator according to the first data of the training sample with the first data of the training sample aiming at each selected training sample to obtain combined data, and setting the label of the combined data to be 0;
setting the label of each selected training sample to be 1;
and inputting the combined data after the label setting and the training sample after the label setting into a discriminator, and training the network weight of the discriminator.
Optionally, the training the network weight of the generator in the combined model composed of the discriminator and the generator based on the selected training sample specifically includes:
respectively inputting first data of the training samples into a generator aiming at each selected training sample to obtain an MMP predicted value corresponding to the training sample output by the generator;
respectively combining the first data of the training samples with the MMP predicted values corresponding to the training samples aiming at each selected training sample to obtain combined data, and setting the label of the combined data to be 1;
and inputting the combined data after the label setting into a discriminator to obtain the probability that the combined data output by the discriminator is real data.
Optionally, the performing H1 times of iterative training according to the training sample set to obtain the pre-trained generator includes:
and optimizing the iterative training times H1, the training sample number H2, the hyperparameters of the generator and the hyperparameters of the discriminator by using a hyperparameter optimization method to obtain the optimal parameter combination.
In order to achieve the above object, according to another aspect of the present invention, there is provided an MMP prediction apparatus based on a conditional generative adversarial network, the apparatus including:
the data acquisition unit is used for acquiring MMP influence factor data of the target oil reservoir;
a prediction unit, configured to input the MMP influence factor data into a pre-trained generator, so as to obtain an MMP prediction value of the target oil reservoir output by the pre-trained generator, where the generator is obtained by performing multiple iterative training according to a training sample set, and each training sample in the training sample set includes: MMP values of the reservoir and MMP influential factor data of the reservoir.
To achieve the above object, according to another aspect of the present invention, there is also provided a computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, wherein the processor implements the steps of the above MMP prediction method based on a conditional generative adversarial network when executing the computer program.
To achieve the above object, according to another aspect of the present invention, there is also provided a computer program product comprising a computer program/instructions which, when executed by a processor, implement the steps of the above MMP prediction method based on a conditional generative adversarial network.
The invention has the beneficial effects that:
the invention generates a condition type countermeasure network and CO2The method is combined with the prediction of minimum miscible phase pressure (MMP) among crude oil in the oil reservoir, a generator of a condition generating type countermeasure network is trained to serve as an MMP prediction model, and the beneficial effect of accurately and efficiently predicting the MMP in the oil reservoir can be achieved.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly introduced below, and it is obvious that the drawings in the following description are some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts. In the drawings:
Fig. 1 is a first flowchart of an MMP prediction method based on a conditional generative adversarial network according to an embodiment of the present invention;
Fig. 2 is a second flowchart of an MMP prediction method based on a conditional generative adversarial network according to an embodiment of the present invention;
Fig. 3 is a flowchart of the training of the discriminator according to an embodiment of the present invention;
Fig. 4 is a flowchart of the training of the generator according to an embodiment of the present invention;
Fig. 5 is a schematic diagram of a training sample set according to an embodiment of the present invention;
Fig. 6 is a schematic diagram of the network structure of the generator according to an embodiment of the present invention;
Fig. 7 is a schematic diagram of the network structure of the discriminator according to an embodiment of the present invention;
Fig. 8 is a schematic diagram of the combined model according to an embodiment of the present invention;
Fig. 9 is a first block diagram of an MMP prediction apparatus based on a conditional generative adversarial network according to an embodiment of the present invention;
Fig. 10 is a second block diagram of an MMP prediction apparatus based on a conditional generative adversarial network according to an embodiment of the present invention;
Fig. 11 is a schematic diagram of a computer device according to an embodiment of the present invention.
Detailed Description
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As will be appreciated by one skilled in the art, embodiments of the present invention may be provided as a method, system, or computer program product. Accordingly, the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present invention may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
It should be noted that the terms "comprises" and "comprising," and any variations thereof, in the description and claims of the present invention and the above-described drawings, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that the embodiments and features of the embodiments may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments with reference to the attached drawings.
In this application, MMP refers to the minimum miscible pressure between CO2 and the reservoir crude oil.
Generative Adversarial Networks (GANs) are a deep learning model and one of the most promising methods of recent years for unsupervised learning on complex distributions. The model contains (at least) two modules: a generative model (also called the generator) and a discriminative model (also called the discriminator), whose mutual game-style learning produces remarkably good output.
Because the generative adversarial network is an unsupervised machine learning method, it is usually applied to data augmentation: when only a few training samples are available, a generative adversarial network can be used to generate additional data samples for the machine to learn from. Since the generative adversarial network appeared relatively recently, it has only been used for data augmentation by a few petroleum researchers in recent years, and researchers still disagree on how effective the data generated in this way actually are.
The most primitive generative adversarial network takes a random vector as input and outputs a generated object, but what object is generated cannot be controlled. Researchers therefore proposed the conditional generative adversarial network, which adds constraints to the original GAN by introducing a conditional variable y into both the generative model and the discriminative model, so that additional information guides the data generation. In theory y can be any meaningful information, such as a class label, which turns the GAN from an unsupervised learning method into a supervised one.
The appearance of the conditional generative adversarial network changes the generative adversarial network from unsupervised learning to supervised learning, which means the method can be used for parameter prediction in the petroleum field, where it has great application prospects. However, according to a survey of the current literature, the method has not yet been combined with the petroleum industry. The invention combines a conditional generative adversarial network with MMP prediction and provides a method for predicting the minimum miscible pressure (MMP) between CO2 and crude oil based on a conditional generative adversarial network. Through the adversarial training of a conditional generative adversarial network (CGAN), the nonlinear mapping relationship between MMP and its influence factors is learned and established, the model structure is optimized with a Bayesian hyper-parameter optimization method, and the prediction accuracy of the model is comprehensively improved, so that accurate MMP prediction for an unknown oil reservoir is realized, a reliable basis is provided for further development of the reservoir, and efficient reservoir development is supported.
Fig. 1 is a first flowchart of an MMP prediction method based on a conditional generative adversarial network according to an embodiment of the present invention. As shown in fig. 1, in an embodiment of the present invention, the method includes steps S101 and S102.
And step S101, acquiring MMP influence factor data of the target oil reservoir.
In one embodiment of the invention, the MMP influence factor data specifically include: the reservoir temperature (T_R), the mole fraction of volatile components in the crude oil (X_vol), the mole fraction of C2-C4 components in the crude oil (X_C2-4), the mole fraction of C5-C6 components in the crude oil (X_C5-6), the molecular weight of the C7+ fraction of the crude oil (MW_C7+), and the mole fractions of CO2 and of the four impurities in the injected gas (i.e., y_CO2, y_C1, y_N2, y_H2S and y_HC), and so on.
Step S102, inputting the MMP influencing factor data into a pre-trained generator to obtain the MMP predicted value of the target oil reservoir output by the pre-trained generator, wherein the generator is obtained by performing multiple iterative training according to a training sample set, and each training sample in the training sample set comprises: MMP values of the reservoir and MMP influential factor data of the reservoir.
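As an illustration of this prediction step, the sketch below shows how a trained generator might be invoked. It is a minimal sketch rather than the patented implementation: the patent does not name a deep learning framework, so Keras/TensorFlow is assumed here, and the file names ("generator_mmp.h5", "x_min.npy", "x_max.npy"), the feature ordering and the 16-dimensional noise vector are all hypothetical.

```python
import numpy as np
from tensorflow import keras

# Load a previously trained generator (path is hypothetical).
generator = keras.models.load_model("generator_mmp.h5")

# One target reservoir, ordered as
# [T_R, X_vol, X_C2-4, X_C5-6, MW_C7+, y_CO2, y_C1, y_N2, y_H2S, y_HC].
x_raw = np.array([[344.15, 0.02, 0.08, 0.05, 210.0, 0.95, 0.03, 0.01, 0.00, 0.01]])

# Apply the same max-min scaling that was fitted on the training set.
x_min, x_max = np.load("x_min.npy"), np.load("x_max.npy")
x = (x_raw - x_min) / (x_max - x_min)

# The CGAN generator also expects a noise input.
z = np.random.normal(size=(1, 16))
mmp_scaled = generator.predict([x, z], verbose=0)
print("Predicted (normalized) MMP:", float(mmp_scaled[0, 0]))
```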
Fig. 2 is a second flowchart of an MMP prediction method based on a conditional generative adversarial network according to an embodiment of the present invention. As shown in fig. 2, the pre-trained generator of step S102 is obtained through the training of steps S201 and S202.
Step S201, obtaining the training sample set.
In one embodiment of the invention, MMP values of a certain amount of existing oil reservoirs and corresponding MMP influence factor data are collected, and the MMP values and the corresponding MMP influence factor data are divided into a training sample set, a verification sample set and a test sample set according to a certain proportion.
FIG. 5 shows MMP values collected for 105 reservoirs and corresponding MMP influencing factor data for one embodiment of the present invention. Specifically, according to the invention, all data are divided into a training sample set, a verification sample set and a test sample set according to the ratio of 6:2:2, so that the training sample set comprises 63 groups of data, the verification sample set comprises 21 groups of data, and the test sample set comprises 21 groups of data.
In an embodiment of the present invention, after obtaining the training sample set, the verification sample set, and the test sample set, the present invention performs the maximum and minimum normalization processing on the training sample set, and then performs the same processing on the data in the verification sample set and the test sample set by using the maximum value and the minimum value of the data in the training sample set.
In one embodiment of the invention, the max-min normalization formula may be as follows:
x' = (x - x_min) / (x_max - x_min)
where x_min and x_max are the minimum and maximum of the corresponding feature in the training sample set.
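A minimal sketch of this normalization step is given below, assuming NumPy arrays and the 63/21/21 split described above; the scaling statistics are computed on the training set only and reused for the verification and test sets, as stated in the text. The random arrays stand in for the real samples.

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-ins for the training, verification and test sample sets
# (10 influence factors plus the MMP value per row).
train, val, test = rng.random((63, 11)), rng.random((21, 11)), rng.random((21, 11))

x_min = train.min(axis=0)   # per-column minimum of the training set
x_max = train.max(axis=0)   # per-column maximum of the training set

def max_min_scale(data, lo=x_min, hi=x_max):
    """Map each column to [0, 1] using the training-set minimum and maximum."""
    return (data - lo) / (hi - lo)

train_n, val_n, test_n = (max_min_scale(d) for d in (train, val, test))
```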
and S202, performing H1 times of iterative training according to the training sample set to obtain the pre-trained generator, wherein each time of iterative training is divided into multiple batches of training, when each batch of training is performed, H2 training samples are selected from the training sample set, then training is performed on the network weight of the discriminator based on the selected training samples, and finally training is performed on the network weight of the generator in a combined model composed of the discriminator and the generator based on the selected training samples, wherein H1 and H2 are positive integers.
In the invention, each iterative training is divided into a plurality of batches of training, when each batch of training is carried out, H2 training samples are selected from a training sample set, the training samples selected for a plurality of batches of training of the same iterative training are different, and if the number of the remaining samples in the training sample set during the training of a certain batch is less than H2, all the remaining samples are selected to carry out the training of the certain batch. Therefore, after all samples in the training sample set are trained once, an iterative training process of the conditional generation type countermeasure network is realized, which is also called a training period, and the performances of the generator and the discriminator are gradually improved along with the increase of the iterative training times (H1).
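The batch selection described in this paragraph can be sketched as follows; the helper function name and the use of NumPy are assumptions, and the sample array is a stand-in for the real training sample set.

```python
import numpy as np

def iterate_batches(samples, h2, rng):
    """Yield shuffled batches of at most h2 samples; the last batch may be smaller."""
    order = rng.permutation(len(samples))
    for start in range(0, len(samples), h2):
        yield samples[order[start:start + h2]]

rng = np.random.default_rng(42)
training_set = np.arange(63)                     # stand-in for 63 training samples
for batch in iterate_batches(training_set, h2=38, rng=rng):
    print(len(batch))                            # prints 38, then 25
```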
In the invention, H1 iterative training is carried out according to the above process. When iterative training reaches a certain number of times, the MMP predicted value generated by the generator under the corresponding condition is very close to the real data, and the MMP prediction function is realized. In one embodiment of the invention, the prediction error of the generator on the verification set after each iterative training is finished is monitored simultaneously in the iterative training process, the generator after each iterative training is stored separately, and finally the generator with the minimum error on the verification set in the iterative training process is selected, namely the MMP prediction model.
In the invention, the network weight of the discriminator is trained based on the training sample, and then the network weight of the generator is trained in the combined model composed of the discriminator and the generator based on the training sample. Fig. 8 is a schematic diagram of a combination model according to an embodiment of the present invention, and as shown in fig. 8, when the network weight of the generator is trained in the combination model, the network weight of the discriminator does not change, and the network weight of the generator changes along with the training of data.
In one embodiment of the invention, the training sample is comprised of first data that is MMP contributor data for the reservoir and second data that is MMP values for the reservoir.
Fig. 3 is a training flowchart of the discriminator according to an embodiment of the present invention. As shown in fig. 3, in an embodiment of the present invention, the training of the network weights of the discriminator based on the selected training samples in step S202 specifically includes steps S301 to S303.
Step S301, for each selected training sample, combining the MMP prediction value output by the generator according to the first data of the training sample with the first data of the training sample to obtain combined data, and setting the label of the combined data to 0.
Step S302, the label of each selected training sample is set to 1.
Step S303, inputting the combined data after the label setting and the training sample after the label setting into a discriminator, and training the network weight of the discriminator.
In an embodiment of the invention, when an iteration of training starts, a first batch of training samples is selected from the training sample set, the number of training samples in the batch being H2. Using the established generator, the first data in the H2 training samples of the batch are used as the condition input of the generator, which outputs its first MMP prediction values under the corresponding conditions. The first data are then combined with the MMP prediction values output by the generator to obtain combined data, whose label is set to 0, and the combined data are input into the discriminator; at the same time, the combination of the first data and the corresponding real MMP values, i.e. the training data, is labelled 1 and also fed into the discriminator, so that the discriminator learns for the first time to distinguish real data from generated data. Following this process over multiple batches and H1 iterations of training, the iteratively trained discriminator can accurately identify real data.
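A minimal, self-contained sketch of one such discriminator update (steps S301 to S303) is shown below. The patent does not specify a framework, so Keras is assumed; the layer widths, the noise dimension and the random batch data are placeholders rather than the optimized values reported later in the description.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features, noise_dim, h2 = 10, 16, 38

# Toy generator: condition -> preprocess -> concatenate noise -> MMP prediction.
cond_in, z_in = keras.Input((n_features,)), keras.Input((noise_dim,))
g_hidden = layers.Dense(32, activation="relu")(cond_in)
g_out = layers.Dense(1)(layers.concatenate([g_hidden, z_in]))
generator = keras.Model([cond_in, z_in], g_out)

# Toy discriminator: condition + MMP value -> probability of being real.
d_cond, d_mmp = keras.Input((n_features,)), keras.Input((1,))
d_hidden = layers.Dense(32, activation="relu")(d_cond)
d_out = layers.Dense(1, activation="sigmoid")(layers.concatenate([d_hidden, d_mmp]))
discriminator = keras.Model([d_cond, d_mmp], d_out)
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# One batch of H2 training samples: first data x (influence factors), second data y (real MMP).
x = np.random.rand(h2, n_features).astype("float32")
y = np.random.rand(h2, 1).astype("float32")
z = np.random.normal(size=(h2, noise_dim)).astype("float32")

y_fake = generator.predict([x, z], verbose=0)                                # generated MMP values
d_loss_fake = discriminator.train_on_batch([x, y_fake], np.zeros((h2, 1)))   # combined data, label 0
d_loss_real = discriminator.train_on_batch([x, y], np.ones((h2, 1)))         # real samples, label 1
```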
Fig. 4 is a training flowchart of a generator according to an embodiment of the present invention, and as shown in fig. 4, in an embodiment of the present invention, the network weights of the generator are trained in a combined model composed of a discriminator and the generator based on the selected training samples in step S202, which specifically includes steps S401 to S403.
Step S401, respectively inputting the first data of the training sample into the generator for each selected training sample, and obtaining the MMP prediction value corresponding to the training sample output by the generator.
Step S402, aiming at each selected training sample, combining the first data of the training sample with the MMP predicted value corresponding to the training sample to obtain combined data, and setting the label of the combined data to be 1.
Step S403, inputting the combined data after the label setting into the discriminator to obtain the probability that the combined data output by the discriminator is real data.
In the present invention, after the discriminator is trained, its weights are kept unchanged, and the network weights of the generator are then trained in the combined model composed of the discriminator and the generator. Specifically, the invention combines the condition input of the generator (namely the first data in the training data) with the MMP prediction value output by the generator, sets the label to 1, and inputs the result into the combined model; in this way the generator is trained and learns independently without affecting the discriminator, which improves the accuracy of the data generated by the generator.
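The corresponding generator update through the combined model (steps S401 to S403) can be sketched as follows; again Keras is an assumption, small placeholder layer sizes are used, and the common pattern of freezing the discriminator before compiling the combined model stands in for whatever mechanism the authors actually used.

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

n_features, noise_dim, h2 = 10, 16, 38

# Small stand-in generator and discriminator (see the preceding sketch).
cond_in, z_in = keras.Input((n_features,)), keras.Input((noise_dim,))
g_out = layers.Dense(1)(layers.concatenate(
    [layers.Dense(32, activation="relu")(cond_in), z_in]))
generator = keras.Model([cond_in, z_in], g_out)

d_cond, d_mmp = keras.Input((n_features,)), keras.Input((1,))
d_out = layers.Dense(1, activation="sigmoid")(layers.concatenate(
    [layers.Dense(32, activation="relu")(d_cond), d_mmp]))
discriminator = keras.Model([d_cond, d_mmp], d_out)
discriminator.compile(optimizer="adam", loss="binary_crossentropy")

# Combined model: condition + noise -> generator -> (frozen) discriminator.
discriminator.trainable = False
combined = keras.Model([cond_in, z_in],
                       discriminator([cond_in, generator([cond_in, z_in])]))
combined.compile(optimizer="adam", loss="binary_crossentropy")

# One generator update: generated samples are labelled 1 so that only the
# generator's weights move toward "fooling" the fixed discriminator.
x = np.random.rand(h2, n_features).astype("float32")
z = np.random.normal(size=(h2, noise_dim)).astype("float32")
g_loss = combined.train_on_batch([x, z], np.ones((h2, 1)))
```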
FIG. 6 is a schematic diagram of a network structure of a generator according to an embodiment of the present invention, where the input of the generator is MMP influencing factor data, and the output of the generator is an MMP predicted value; the network structure of the generator specifically includes: the first FCNN layer is used for preprocessing MMP influence factor data, the first splicing layer is used for splicing the preprocessed data output by the first FCNN layer with random noise, and the second FCNN layer is used for processing the spliced data output by the first splicing layer and outputting MMP predicted values.
As shown in fig. 6, the present invention constructs the generator of the conditional generative adversarial network using a Fully Connected Neural Network (FCNN). The invention first sets an FCNN layer (i.e., the first FCNN layer) to preprocess the condition X, i.e., the normalized MMP influence factor data; the data preprocessed by this FCNN layer are then spliced with the random noise data Z of the generator; finally, another FCNN layer (i.e., the second FCNN layer) is set to process the spliced data and output the MMP prediction value under the current influence factors, i.e., Y'. In the present invention, the actual MMP value, i.e., the MMP value in the training data, is denoted by Y, and the MMP prediction value output by the generator is denoted by Y'.
In an embodiment of the present invention, the hyper-parameters of the generator specifically include: the number of layers of the first FCNN layer, the number of neurons in each layer of the first FCNN layer, the discarding rate of each layer of the first FCNN layer, the number of layers of the second FCNN layer, the number of neurons in each layer of the second FCNN layer, the discarding rate of each layer of the second FCNN layer, and the initial learning rate of the generator.
In one embodiment of the invention, the initial learning rate of the generator is specifically the initial learning rate of the Adam optimizer in the generator.
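One way to realize the generator structure of Fig. 6 is sketched below. The framework (Keras) and the default layer counts, widths, dropout rates and noise dimension are assumptions used only for illustration; in the patent these hyper-parameters are chosen by Bayesian optimization.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_generator(n_features=10, noise_dim=16,
                    pre_units=(64, 32), post_units=(64, 32), drop=0.2):
    x_in = keras.Input((n_features,), name="condition_x")   # normalized influence factors
    z_in = keras.Input((noise_dim,), name="noise_z")         # random noise

    h = x_in
    for u in pre_units:                     # first FCNN layer(s): preprocess X
        h = layers.Dropout(drop)(layers.Dense(u, activation="relu")(h))

    h = layers.concatenate([h, z_in])       # first splicing layer: join with noise Z

    for u in post_units:                    # second FCNN layer(s)
        h = layers.Dropout(drop)(layers.Dense(u, activation="relu")(h))
    y_pred = layers.Dense(1, activation="relu", name="mmp_prediction")(h)

    return keras.Model([x_in, z_in], y_pred, name="generator")

generator = build_generator()
generator.summary()
```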
Fig. 7 is a schematic diagram of the network structure of the discriminator according to an embodiment of the present invention. As shown in fig. 7, the input of the discriminator is MMP influence factor data together with an MMP value (which may be the MMP prediction value output by the generator), and the output of the discriminator is the probability that the data are real data; the network structure of the discriminator specifically includes: a third FCNN layer used for preprocessing the MMP influence factor data, a second splicing layer used for splicing the preprocessed data output by the third FCNN layer with an MMP value, and a fourth FCNN layer used for processing the spliced data output by the second splicing layer and outputting the probability that the data are real data.
As shown in fig. 7, the discriminator receives two inputs, the first input being a condition X, i.e., normalized MMP influence factor data, for preprocessing, and the second being a true MMP value Y corresponding to the condition X or an MMP predicted value Y' generated by the generator under the condition X.
As shown in fig. 7, in an embodiment of the present invention, when the discriminator is constructed, an FCNN layer (i.e., the third FCNN layer) is first set to preprocess the condition X, i.e., the normalized MMP influence factor data. The data preprocessed by this FCNN layer are then spliced with the input MMP value (the real MMP value Y corresponding to the condition X, or the MMP prediction value Y' generated by the generator under the condition X). Finally, another FCNN layer (i.e., the fourth FCNN layer) is set to process the spliced data; the last layer of this FCNN has 1 neuron with a sigmoid activation function and outputs the probability that the currently input data are real data. If the output probability is greater than 0.5 the data are judged to be real data; otherwise they are false data, i.e., data generated by the generator.
In an embodiment of the present invention, the hyper-parameters of the discriminator specifically include: the number of layers of the third FCNN layer, the number of neurons in each layer of the third FCNN layer, the discarding rate of each layer of the third FCNN layer, the number of layers of the fourth FCNN layer, the number of neurons in each layer of the fourth FCNN layer, the discarding rate of each layer of the fourth FCNN layer, and the initial learning rate of the discriminator.
In one embodiment of the invention, the initial learning rate of the discriminator is specifically the initial learning rate of the Adam optimizer in the discriminator.
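A corresponding sketch of the discriminator structure of Fig. 7 is given below: the condition X is preprocessed by one Dense stack, concatenated with an MMP value (real or generated), and passed through a second Dense stack ending in a single sigmoid unit. As before, the framework and the default widths, dropout rates and learning rate are illustrative assumptions, not the optimized values.

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_discriminator(n_features=10, pre_units=(64, 32), post_units=(64, 32),
                        drop=0.2, lr=1e-4):
    x_in = keras.Input((n_features,), name="condition_x")   # normalized influence factors
    y_in = keras.Input((1,), name="mmp_value")               # real MMP value Y or generated Y'

    h = x_in
    for u in pre_units:                     # third FCNN layer(s): preprocess X
        h = layers.Dropout(drop)(layers.Dense(u, activation="relu")(h))

    h = layers.concatenate([h, y_in])       # second splicing layer: join with the MMP value

    for u in post_units:                    # fourth FCNN layer(s)
        h = layers.Dropout(drop)(layers.Dense(u, activation="relu")(h))
    p_real = layers.Dense(1, activation="sigmoid", name="p_real")(h)  # probability of real data

    model = keras.Model([x_in, y_in], p_real, name="discriminator")
    model.compile(optimizer=keras.optimizers.Adam(learning_rate=lr),
                  loss="binary_crossentropy")
    return model

discriminator = build_discriminator()
discriminator.summary()
```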
In an embodiment of the present invention, when performing iterative training in step S202, the present invention further optimizes the iterative training times H1, the number of training samples H2, the hyper-parameters of the generator, and the hyper-parameters of the discriminator by using a hyper-parameter optimization method, so as to obtain an optimal parameter combination. And then carrying out model iterative training based on the optimal parameter combination to obtain a generator, namely an MMP prediction model.
In a specific embodiment of the invention, the invention utilizes a Bayesian hyper-parameter optimization method to optimize the hyper-parameters of the generator and the discriminator, the training sample number (H2) in each batch in each iterative training process and the iterative training times (H1), and searches the parameter combination which enables the model to have the best prediction effect in the verification set, and takes the parameter combination with the best prediction effect in the verification set as the best parameter combination.
During Bayesian optimization, several groups of parameter combinations are first evaluated at random as trial calculations; the number of trial calculations can be set manually and is set to 10 here. The actual Bayesian optimization is then performed: after the trial calculations are finished, each Bayesian optimization step refers to the results of the previous calculations, i.e., the performance of the model on the verification set, in order to select the hyper-parameters and the number of training iterations to be used in the next calculation. The number of actual Bayesian optimization steps is set to 90.
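A hedged sketch of such a search is given below using scikit-optimize's gp_minimize; the patent does not name a library, and the search space, the parameter names and the placeholder objective are illustrative only. The call budget mirrors the description: 10 random trial evaluations followed by 90 guided Bayesian steps, 100 calls in total. In practice the objective would train the CGAN with the proposed settings and return the generator's error on the verification set.

```python
from skopt import gp_minimize
from skopt.space import Integer, Real

space = [
    Integer(100, 1000, name="h1_epochs"),          # iterative training times H1
    Integer(8, 64, name="h2_batch_size"),          # training samples per batch H2
    Integer(1, 5, name="gen_pre_layers"),          # layers in the first FCNN stack
    Real(0.0, 0.5, name="gen_dropout"),            # generator dropout rate
    Real(1e-5, 1e-3, prior="log-uniform", name="gen_learning_rate"),
]

def objective(params):
    h1, h2, n_layers, drop, lr = params
    # Placeholder: train the conditional GAN with these settings and return the
    # mean absolute percentage error of the generator on the verification set.
    return (drop - 0.25) ** 2 + abs(lr - 2e-4) * 10 + 0.001 * abs(h2 - 32)

result = gp_minimize(objective, space,
                     n_initial_points=10,   # random trial calculations
                     n_calls=100,           # 10 random + 90 Bayesian steps
                     random_state=0)
print("best parameters:", result.x, "best validation error:", result.fun)
```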
In one embodiment of the invention, the invention builds MMP prediction models (i.e., generators) using the best parameter combinations obtained by bayesian hyperparametric optimization. The model is a model which is used for predicting the MMP of a new oil reservoir, and the corresponding MMP value can be predicted by inputting MMP influence factor data of the new oil reservoir.
In one embodiment of the present invention, the optimal parameter combination obtained by using bayesian hyperparametric optimization may be as follows:
In the generator, the FCNN stack (i.e., the first FCNN layer) that preprocesses the input influence factors X is set to 2 layers: the first layer has 430 neurons with a discarding rate of 0.1943, the second layer has 1 neuron, and all activation functions are relu. The FCNN stack (i.e., the second FCNN layer) that processes the spliced data is set to 5 layers: each of the first four layers has 381 neurons with a discarding rate of 0.2944, the fifth layer has 1 neuron, and all activation functions are relu. The initial learning rate of the Adam optimizer in the generator is set to 0.0001960.
In the discriminator, the FCNN stack (i.e., the third FCNN layer) that preprocesses the input influence factors X is set to 2 layers: the first layer has 247 neurons with a discarding rate of 0.0123, the second layer has 1 neuron, and all activation functions are relu. The FCNN stack (i.e., the fourth FCNN layer) that processes the spliced data is set to 4 layers: each of the first three layers has 110 neurons with a discarding rate of 0.3779, the last layer has 1 neuron, the activation functions of the first three layers are relu, and the activation function of the last layer is sigmoid. The initial learning rate of the Adam optimizer in the discriminator is set to 0.0001758.
The iterative training times (H1) are set to 890 after being subjected to Bayesian optimization, and the training sample number (H2) in each batch in each iterative training process is set to 38 after being subjected to Bayesian optimization.
For comparison, MMP prediction models are also established from the same training sample set using three other machine learning methods (FCNN, RF and SVM), and the structure of each model is optimized by combining the Bayesian algorithm with the verification set data. Finally, the prediction accuracy of each optimized model is evaluated on the same, previously unseen test sample set.
Table 1: Mean absolute percentage error of the predictions (the values of Table 1 appear only as an image in the original publication)
As can be seen from Table 1, the error of the CGAN-based MMP prediction model of the present invention on the test set is 1, 3 and 8 percentage points lower than that of the FCNN, RF and SVM models respectively, the highest accuracy among the four machine learning methods. This reflects the strong fitting capability of the CGAN; its prediction accuracy is higher than that of FCNN, RF and SVM, which corroborates the good performance that CGAN has shown in other fields.
As can be seen from the above embodiments, the MMP prediction method based on the conditional generative adversarial network of the present invention achieves at least the following beneficial effects:
1. The method combines the conditional generative adversarial network with reservoir MMP prediction for the first time, learns from a large amount of reservoir MMP data, and establishes a data-driven reservoir MMP prediction model with improved prediction accuracy. It is a new idea and method for MMP prediction, opens up the use of conditional generative adversarial networks in MMP prediction, and is of great significance for reservoir MMP prediction and the design of reservoir development schemes.
2. In the modeling process, the method optimizes the hyper-parameters in the network by using a Bayesian optimization method, thereby comprehensively improving the model prediction precision.
3. The method has a simple and convenient modelling process, high computational efficiency, high prediction accuracy, and broad coverage and applicability; it lays a foundation for the large-scale application of machine learning and conditional generative adversarial networks in reservoir MMP prediction and has wide application prospects.
It should be noted that the steps illustrated in the flowcharts of the figures may be performed in a computer system such as a set of computer-executable instructions and that, although a logical order is illustrated in the flowcharts, in some cases, the steps illustrated or described may be performed in an order different than presented herein.
Based on the same inventive concept, the embodiment of the present invention further provides an MMP prediction apparatus based on a conditional generation-based countermeasure network, which can be used to implement the MMP prediction method based on the conditional generation-based countermeasure network described in the foregoing embodiment, as described in the following embodiment. Since the principle of solving the problem of the MMP prediction apparatus based on the conditional generation-type countermeasure network is similar to that of the MMP prediction method based on the conditional generation-type countermeasure network, the embodiment of the MMP prediction apparatus based on the conditional generation-type countermeasure network can be referred to the embodiment of the MMP prediction method based on the conditional generation-type countermeasure network, and the repeated details are not repeated. As used hereinafter, the term "unit" or "module" may be a combination of software and/or hardware that implements a predetermined function. Although the means described in the embodiments below are preferably implemented in software, an implementation in hardware, or a combination of software and hardware is also possible and contemplated.
Fig. 9 is a first block diagram of an MMP prediction apparatus based on a conditional generative adversarial network according to an embodiment of the present invention. As shown in fig. 9, in an embodiment of the present invention, the MMP prediction apparatus includes:
the data acquisition unit 1 is used for acquiring MMP influence factor data of a target oil reservoir;
the predicting unit 2 is configured to input the MMP influence factor data into a pre-trained generator, and obtain an MMP prediction value of the target oil reservoir output by the pre-trained generator, where the generator is obtained by performing multiple iterative training according to a training sample set, and each training sample in the training sample set includes: MMP values of the reservoir and MMP influential factor data of the reservoir.
Fig. 10 is a second block diagram of the MMP prediction apparatus based on a conditional generative adversarial network according to an embodiment of the present invention. As shown in fig. 10, in an embodiment of the present invention, the apparatus further includes:
a training sample set obtaining unit 3, configured to obtain the training sample set;
and the model training unit 4 is used for performing H1 times of iterative training according to the training sample set to obtain the pre-trained generator, wherein each time of iterative training is divided into multiple batches of training, when each batch of training is performed, H2 training samples are selected from the training sample set, then the network weight of the discriminator is trained based on the selected training samples, and finally the network weight of the generator is trained in a combined model composed of the discriminator and the generator based on the selected training samples, wherein H1 and H2 are positive integers.
In one embodiment of the invention, the training sample is comprised of first data that is MMP contributor data for the reservoir and second data that is MMP values for the reservoir. In an embodiment of the present invention, the model training unit specifically includes:
the first label setting module is used for combining the MMP predicted value output by the generator according to the first data of the training sample with the first data of the training sample respectively aiming at each selected training sample to obtain combined data, and setting the label of the combined data to be 0;
the second label setting module is used for setting the label of each selected training sample to be 1;
and the discriminator training module is used for inputting the combined data after the label setting and the training sample after the label setting into the discriminator and training the network weight of the discriminator.
In an embodiment of the present invention, the model training unit specifically includes:
the predicted value acquisition module is used for inputting the first data of the training samples into the generator respectively aiming at each selected training sample to obtain the MMP predicted value corresponding to the training sample output by the generator;
the combined data acquisition module is used for combining the first data of the training samples with the MMP predicted values corresponding to the training samples respectively aiming at each selected training sample to obtain combined data, and setting the label of the combined data to be 1;
and the generator training module is used for inputting the combined data after the label setting into the discriminator to obtain the probability that the combined data output by the discriminator is real data.
Optionally, the model training unit further includes:
and the hyper-parameter optimization module is used for optimizing the iterative training times H1, the training sample number H2, the hyper-parameters of the generator and the hyper-parameters of the discriminator by adopting a hyper-parameter optimization method to obtain the optimal parameter combination.
To achieve the above object, according to another aspect of the present application, there is also provided a computer apparatus. As shown in fig. 11, the computer device comprises a memory, a processor, a communication interface and a communication bus, wherein a computer program that can be run on the processor is stored in the memory, and the steps of the method of the embodiment are realized when the processor executes the computer program.
The processor may be a Central Processing Unit (CPU). The Processor may also be other general purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, or a combination thereof.
The memory, which is a non-transitory computer readable storage medium, may be used to store non-transitory software programs, non-transitory computer executable programs, and units, such as the corresponding program units in the above-described method embodiments of the present invention. The processor executes various functional applications of the processor and the processing of the work data by executing the non-transitory software programs, instructions and modules stored in the memory, that is, the method in the above method embodiment is realized.
The memory may include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required for at least one function; the storage data area may store data created by the processor, and the like. Further, the memory may include high speed random access memory, and may also include non-transitory memory, such as at least one disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory located remotely from the processor, and such remote memory may be coupled to the processor via a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The one or more units are stored in the memory and when executed by the processor perform the method of the above embodiments.
The specific details of the computer device may be understood by referring to the corresponding related descriptions and effects in the above embodiments, and are not described herein again.
In order to achieve the above object, according to another aspect of the present application, there is also provided a computer-readable storage medium storing a computer program which, when executed in a computer processor, implements the steps in the above MMP prediction method based on a conditional generation-based countermeasure network. It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic Disk, an optical Disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), a Flash Memory (Flash Memory), a Hard Disk (Hard Disk Drive, abbreviated as HDD) or a Solid State Drive (SSD), etc.; the storage medium may also comprise a combination of memories of the kind described above.
To achieve the above object, according to another aspect of the present application, there is also provided a computer program product comprising a computer program/instructions which, when executed by a processor, implement the steps of the above MMP prediction method based on a conditional generative adversarial network.
It will be apparent to those skilled in the art that the modules or steps of the present invention described above may be implemented by a general purpose computing device, they may be centralized on a single computing device or distributed across a network of multiple computing devices, and they may alternatively be implemented by program code executable by a computing device, such that they may be stored in a storage device and executed by a computing device, or fabricated separately as individual integrated circuit modules, or fabricated as a single integrated circuit module from multiple modules or steps. Thus, the present invention is not limited to any specific combination of hardware and software.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (10)

1. An MMP prediction method based on a conditional generative adversarial network, comprising:
acquiring MMP influence factor data of a target oil reservoir;
inputting the MMP influencing factor data into a pre-trained generator to obtain an MMP predicted value of the target oil reservoir output by the pre-trained generator, wherein the generator is obtained by carrying out multiple iterative training according to a training sample set, and each training sample in the training sample set comprises: MMP values of the reservoir and MMP influential factor data of the reservoir.
2. The MMP prediction method based on a conditional generative adversarial network according to claim 1, further comprising:
acquiring the training sample set;
and performing H1 times of iterative training according to the training sample set to obtain the pre-trained generator, wherein each time of iterative training is divided into multiple batches of training, when each batch of training is performed, H2 training samples are selected from the training sample set, then training is performed on the network weight of the discriminator based on the selected training samples, and finally training is performed on the network weight of the generator in a combined model composed of the discriminator and the generator based on the selected training samples, wherein H1 and H2 are positive integers.
3. The MMP prediction method based on a conditional generative adversarial network according to claim 2, wherein the training sample is composed of first data and second data, the first data being MMP influence factor data of the oil reservoir and the second data being the MMP value of the oil reservoir;
the training of the network weight of the discriminator based on the selected training sample specifically comprises:
respectively combining the MMP predicted value output by the generator according to the first data of the training sample with the first data of the training sample aiming at each selected training sample to obtain combined data, and setting the label of the combined data to be 0;
setting the label of each selected training sample to be 1;
and inputting the combined data after the label setting and the training sample after the label setting into a discriminator, and training the network weight of the discriminator.
4. The MMP prediction method based on a conditional generative adversarial network according to claim 3, wherein the training of the network weights of the generator in the combined model composed of the discriminator and the generator based on the selected training samples comprises:
respectively inputting first data of the training samples into a generator aiming at each selected training sample to obtain an MMP predicted value corresponding to the training sample output by the generator;
respectively combining the first data of the training samples with the MMP predicted values corresponding to the training samples aiming at each selected training sample to obtain combined data, and setting the label of the combined data to be 1;
and inputting the combined data after the label setting into a discriminator to obtain the probability that the combined data output by the discriminator is real data.
5. The MMP prediction method based on a conditional generative adversarial network according to claim 2, wherein the performing of H1 iterations of training according to the training sample set to obtain the pre-trained generator comprises:
and optimizing the iterative training times H1, the training sample number H2, the hyperparameters of the generator and the hyperparameters of the discriminator by using a hyperparameter optimization method to obtain the optimal parameter combination.
6. The MMP prediction method based on a conditional generative adversarial network according to claim 2, wherein the input of the generator is MMP influence factor data and the output of the generator is an MMP prediction value; the network structure of the generator specifically includes: a first FCNN layer used for preprocessing the MMP influence factor data, a first splicing layer used for splicing the preprocessed data output by the first FCNN layer with random noise, and a second FCNN layer used for processing the spliced data output by the first splicing layer and outputting the MMP prediction value.
7. The MMP prediction method based on a conditional generative adversarial network according to claim 2, wherein the input of the discriminator is MMP influence factor data and an MMP value, the MMP value including the MMP prediction value output by the generator, and the output of the discriminator is the probability that the data are real data; the network structure of the discriminator specifically includes: a third FCNN layer used for preprocessing the MMP influence factor data, a second splicing layer used for splicing the preprocessed data output by the third FCNN layer with an MMP value, and a fourth FCNN layer used for processing the spliced data output by the second splicing layer and outputting the probability that the data are real data.
8. An MMP prediction apparatus based on a conditional generative adversarial network, comprising:
a data acquisition unit for acquiring MMP (minimum miscibility pressure) influence factor data of a target oil reservoir;
and a prediction unit for inputting the MMP influence factor data into a pre-trained generator to obtain the MMP predicted value of the target oil reservoir output by the pre-trained generator, wherein the generator is obtained by performing multiple rounds of iterative training according to a training sample set, and each training sample in the training sample set includes the MMP value of an oil reservoir and the MMP influence factor data of the oil reservoir.
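For illustration only: how the prediction unit of claim 8 could call a trained generator such as the Generator sketch above. The influence-factor values, their ordering and the variable name generator are made up for the example.

    import torch

    # hypothetical MMP influence factors of the target oil reservoir, one row per reservoir
    factors = torch.tensor([[95.0, 36.2, 210.0, 0.12, 0.55, 0.08]])

    generator.eval()                    # generator: a pre-trained instance of the sketch above
    with torch.no_grad():
        mmp_pred = generator(factors)   # MMP predicted value of the target oil reservoir
    print(float(mmp_pred))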
9. A computer device comprising a memory, a processor and a computer program stored on the memory and executable on the processor, characterized in that, when the computer program is executed by the processor, the steps of the method according to any one of claims 1 to 7 are implemented.
10. A computer program product comprising computer program/instructions, characterized in that the computer program/instructions, when executed by a processor, implement the steps of the method of any one of claims 1 to 7.
CN202111493520.7A 2021-12-08 2021-12-08 MMP prediction method and device based on conditional generative adversarial network Active CN114169240B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202111493520.7A CN114169240B (en) 2021-12-08 2021-12-08 MMP prediction method and device based on conditional generative adversarial network

Publications (2)

Publication Number Publication Date
CN114169240A (en) 2022-03-11
CN114169240B (en) 2024-09-17

Family

ID=80484473

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202111493520.7A Active CN114169240B (en) 2021-12-08 2021-12-08 MMP prediction method and device based on conditional generative adversarial network

Country Status (1)

Country Link
CN (1) CN114169240B (en)

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2021007812A1 (en) * 2019-07-17 2021-01-21 深圳大学 Deep neural network hyperparameter optimization method, electronic device and storage medium
WO2021197223A1 (en) * 2020-11-13 2021-10-07 平安科技(深圳)有限公司 Model compression method, system, terminal, and storage medium
CN113435128A (en) * 2021-07-15 2021-09-24 中国石油大学(北京) Oil and gas reservoir yield prediction method and device based on condition generation type countermeasure network
CN113537592A (en) * 2021-07-15 2021-10-22 中国石油大学(北京) Oil and gas reservoir yield prediction method and device based on long-time and short-time memory network

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110109359A (en) * 2019-05-21 2019-08-09 中国石油大学(华东) A kind of Safety Integrity Levels appraisal procedure of offshore oil well control equipment
CN110109359B (en) * 2019-05-21 2023-03-10 中国石油大学(华东) Safety integrity level evaluation method for offshore oil well control equipment
US20250277780A1 (en) * 2024-03-01 2025-09-04 China University Of Petroleum (Beijing) Minimum miscible pressure prediction method for co2-crude oil system considering reservoir well spacing

Also Published As

Publication number Publication date
CN114169240B (en) 2024-09-17

Similar Documents

Publication Publication Date Title
Sangiorgio et al. Robustness of LSTM neural networks for multi-step forecasting of chaotic time series
Gao et al. Predicting human mobility via variational attention
EP4261749A1 (en) Automated creation of tiny deep learning models based on multi-objective reward function
CN114372526B (en) Data recovery method, system, computer equipment and storage medium
CN113435128B (en) Oil and gas reservoir production prediction method and device based on conditional generative adversarial network
CN114169240A (en) MMP prediction method and device based on conditional generative adversarial network
Kumar et al. Wind speed prediction using deep learning-LSTM and GRU
Spiliotis Time series forecasting with statistical, machine learning, and deep learning methods: Past, present, and future
Ni et al. The close relationship between contrastive learning and meta-learning
CN114399121A (en) MMP prediction method and device based on random forest algorithm
Liu et al. Deep Boltzmann machines aided design based on genetic algorithms
Li et al. First-order sensitivity analysis for hidden neuron selection in layer-wise training of networks
Agiollo et al. Shallow2Deep: Restraining neural networks opacity through neural architecture search
WO2022142026A1 (en) Classification network construction method, and classification method based on classification network
Sood et al. Neunets: An automated synthesis engine for neural network design
Chen et al. Relace: Reinforcement learning agent for counterfactual explanations of arbitrary predictive models
CN114399119B (en) MMP prediction method and device based on condition convolution generation type countermeasure network
Friede et al. A variational-sequential graph autoencoder for neural architecture performance prediction
CN117273155B (en) Graph maximum segmentation method and system based on quantum approximation optimization algorithm
Song et al. Mutual information dropout: Mutual information can be all you need
Satapathy et al. Unsupervised feature selection using rough set and teaching learning-based optimisation
Li et al. Umformer: a transformer dedicated to univariate multistep prediction
Yan et al. A fast evolutionary algorithm for combinatorial optimization problems
Cai et al. ALGNN: Auto-designed lightweight graph neural network
CN114399120B (en) MMP prediction method and device based on convolutional neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant