Deep Neural Networks Combining Multi-Task Learning for Solving Delay Integro-Differential Equations
WANG Chen-yao, SHI Feng
Author affiliations
WANG Chen-yao  School of Science, Harbin Institute of Technology (Shenzhen), Shenzhen 518055, Guangdong
SHI Feng  School of Science, Harbin Institute of Technology (Shenzhen), Shenzhen 518055, Guangdong
Abstract:
Deep neural networks (DNNs) are effective in solving both forward and inverse problems of nonlinear partial differential equations (PDEs). However, conventional DNN methods often struggle with problems such as delay differential equations (DDEs) and delay integro-differential equations (DIDEs) with constant delays, mainly because these equations have low regularity at the delay-induced breaking points. This paper proposes a DNN method combined with multi-task learning (MTL) to solve the forward and inverse problems of DIDEs. The core idea is to divide the original equation into multiple tasks according to the delay, represent the integral terms with auxiliary outputs, and then use MTL to seamlessly incorporate the properties at the breaking points into the loss function. Moreover, given the increased training difficulty caused by multiple tasks and outputs, we adopt a sequential training scheme to reduce training complexity and provide reference solutions for subsequent tasks. Compared with conventional DNN methods, this approach significantly improves the accuracy of solving DIDEs with DNNs. We verify the effectiveness of the method through several numerical examples, test various parameter-sharing structures in MTL, and compare their effects. Finally, the method is applied to the inverse problem of a nonlinear DIDE; the results show that it can discover the unknown parameters of the DIDE from sparse or noisy data.
Keywords:  delay integro-differential equation  multi-task learning  parameter-sharing structure  deep neural network  sequential training scheme
DOI:
Classification code: O242.2
Funding:
DEEP NEURAL NETWORKS COMBINING MULTI-TASK LEARNING FOR SOLVING DELAY INTEGRO-DIFFERENTIAL EQUATIONS
WANG Chen-yao,SHI Feng
Abstract:
Deep neural networks (DNNs) are effective in solving both forward and inverse problems for nonlinear partial differential equations (PDEs). However, conventional DNNs are not effective in handling problems such as delay differential equations (DDEs) and delay integro-differential equations (DIDEs) with constant delays, primarily due to their low regularity at delay-induced breaking points. In this paper, a DNN method that combines multi-task learning (MTL) is proposed to solve both the forward and inverse problems of DIDEs. The core idea of this approach is to divide the original equation into multiple tasks based on the delay, using auxiliary outputs to represent the integral terms, followed by the use of MTL to seamlessly incorporate the properties at the breaking points into the loss function. Furthermore, given the increased training difficulty associated with multiple tasks and outputs, we employ a sequential training scheme to reduce training complexity and provide reference solutions for subsequent tasks. This approach significantly enhances the approximation accuracy of solving DIDEs with DNNs, as demonstrated by comparisons with traditional DNN methods. We validate the effectiveness of this method through several numerical experiments, test various parameter-sharing structures in MTL, and compare the testing results of these structures. Finally, this method is applied to solve the inverse problem of a nonlinear DIDE, and the results show that the unknown parameters of the DIDE can be discovered from sparse or noisy data.
Key words:  delay integro-differential equation  multi-task learning  parameter-sharing structure  deep neural network  sequential training scheme
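The delay-based task decomposition described in the abstract follows the classical method of steps: on each window [kτ, (k+1)τ] the delayed argument t − τ falls in the previous window, so each window can be treated as an independent task whose value at the right endpoint supplies the initial data for the next one, mirroring the sequential training scheme. A minimal sketch of this splitting on the test DDE u′(t) = −u(t − 1) with unit history, using a plain explicit Euler integrator in place of a DNN (the solver, step size, and test equation here are illustrative assumptions, not the paper's method):

```python
import numpy as np

def solve_dde_by_steps(f_hist, tau=1.0, n_intervals=2, h=1e-3):
    """Method of steps for u'(t) = -u(t - tau): on each window
    [k*tau, (k+1)*tau] the delayed term u(t - tau) is already known from
    the previous window, so the DDE reduces to an ODE that can be solved
    on its own -- the same per-window split that defines one 'task' in
    the multi-task formulation."""
    n = int(round(tau / h))
    # history on [-tau, 0], sampled on the step grid
    prev = np.array([f_hist(-tau + i * h) for i in range(n + 1)])
    windows = []
    u0 = prev[-1]  # continuity at the breaking point t = k*tau
    for k in range(n_intervals):
        cur = np.empty(n + 1)
        cur[0] = u0
        for i in range(n):
            # explicit Euler step; the delayed value comes from the
            # previous window at the same grid index
            cur[i + 1] = cur[i] + h * (-prev[i])
        windows.append(cur)
        prev, u0 = cur, cur[-1]  # right endpoint seeds the next task
    return windows

windows = solve_dde_by_steps(lambda t: 1.0)
```

For this test equation the exact solution is piecewise polynomial: u(t) = 1 − t on [0, 1] and u(t) = −(t − 1) + (t − 1)²/2 on [1, 2], so u(1) = 0 and u(2) = −1/2, and the computed window endpoints match these values to Euler accuracy. The reduced regularity at the breaking points t = kτ (u′ jumps at t = 0, u″ at t = 1, and so on) is exactly the behavior the MTL loss terms in the paper are designed to capture.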
