
Journal article

BAS-ADAM: an ADAM based approach to improve the performance of beetle antennae search optimizer

Ameer Hamza Khan, Xinwei Cao, Shuai Li, Vasilios N. Katsikis, Liefa Liao

IEEE/CAA Journal of Automatica Sinica, Volume: 7, Issue: 2, Pages: 461 - 471

Swansea University Author: Shuai Li

Abstract

In this paper, we propose an enhancement to the beetle antennae search (BAS) algorithm, called BAS-ADAM, which smooths its convergence behavior and avoids trapping in local minima for highly non-convex objective functions. We achieve this by adaptively adjusting the step size in each iteration using the adaptive moment estimation (ADAM) update rule. The proposed algorithm also increases the convergence rate in narrow valleys. A key feature of the ADAM update rule is its ability to adjust the step size for each dimension separately rather than using a single shared step size. Because ADAM is traditionally used with gradient-based optimization algorithms, we first propose a gradient-estimation model that does not require differentiating the objective function. As a result, the algorithm demonstrates excellent performance and a fast convergence rate when searching for the optimum of non-convex functions. The efficiency of the proposed algorithm was tested on three benchmark problems, including the training of a high-dimensional neural network, and its performance was compared with the particle swarm optimizer (PSO) and the original BAS algorithm.

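The abstract's core idea, a derivative-free gradient estimate from the beetle's two antennae probes, fed through ADAM's per-dimension adaptive step size, can be sketched as follows. This is a minimal illustration under stated assumptions: the function name `bas_adam`, its parameters, and its defaults are illustrative, not the authors' implementation.

```python
import numpy as np

def bas_adam(f, x0, iters=300, d=0.5, lr=0.1,
             beta1=0.9, beta2=0.999, eps=1e-8, seed=0):
    """Minimize f starting from x0 (hypothetical BAS-ADAM sketch).

    Two antenna probes along a random unit direction give a
    derivative-free gradient estimate, which the ADAM update rule
    turns into a separate adaptive step size per dimension.
    """
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    m = np.zeros_like(x)  # first moment (running mean of estimates)
    v = np.zeros_like(x)  # second moment (running mean of squares)
    for t in range(1, iters + 1):
        b = rng.standard_normal(x.shape)
        b /= np.linalg.norm(b)  # random antenna direction
        # Central difference along b: gradient estimate without
        # differentiating the objective function.
        g = (f(x + d * b) - f(x - d * b)) / (2 * d) * b
        m = beta1 * m + (1 - beta1) * g
        v = beta2 * v + (1 - beta2) * g * g
        m_hat = m / (1 - beta1 ** t)  # bias-corrected moments
        v_hat = v / (1 - beta2 ** t)
        # Each dimension is scaled by its own running statistics.
        x = x - lr * m_hat / (np.sqrt(v_hat) + eps)
    return x
```

On a simple non-convex-style test such as the sphere function, `bas_adam(lambda x: float(np.sum(x**2)), [3.0, -2.0])` drives the iterate close to the origin without any analytic gradient, which is the property the paper exploits.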

Published in: IEEE/CAA Journal of Automatica Sinica
ISSN: 2329-9266 (print); 2329-9274 (electronic)
Published: Institute of Electrical and Electronics Engineers (IEEE), 1 March 2020
DOI: 10.1109/jas.2020.1003048
Online Access: Check full text

URI: https://cronfa.swan.ac.uk/Record/cronfa53871
fullrecord <?xml version="1.0"?><rfc1807><datestamp>2020-07-06T17:34:20.1605859</datestamp><bib-version>v2</bib-version><id>53871</id><entry>2020-03-27</entry><title>BAS-ADAM: an ADAM based approach to improve the performance of beetle antennae search optimizer</title><swanseaauthors><author><sid>42ff9eed09bcd109fbbe484a0f99a8a8</sid><ORCID>0000-0001-8316-5289</ORCID><firstname>Shuai</firstname><surname>Li</surname><name>Shuai Li</name><active>true</active><ethesisStudent>false</ethesisStudent></author></swanseaauthors><date>2020-03-27</date><deptcode>MECH</deptcode><abstract>In this paper, we propose enhancements to Beetle Antennae search ( BAS ) algorithm, called BAS-ADAM, to smoothen the convergence behavior and avoid trapping in local-minima for a highly non-convex objective function. We achieve this by adaptively adjusting the step-size in each iteration using the adaptive moment estimation ( ADAM ) update rule. The proposed algorithm also increases the convergence rate in a narrow valley. A key feature of the ADAM update rule is the ability to adjust the step-size for each dimension separately instead of using the same step-size. Since ADAM is traditionally used with gradient-based optimization algorithms, therefore we first propose a gradient estimation model without the need to differentiate the objective function. Resultantly, it demonstrates excellent performance and fast convergence rate in searching for the optimum of non-convex functions. The efficiency of the proposed algorithm was tested on three different benchmark problems, including the training of a high-dimensional neural network. 
The performance is compared with particle swarm optimizer ( PSO ) and the original BAS algorithm.</abstract><type>Journal Article</type><journal>IEEE/CAA Journal of Automatica Sinica</journal><volume>7</volume><journalNumber>2</journalNumber><paginationStart>461</paginationStart><paginationEnd>471</paginationEnd><publisher>Institute of Electrical and Electronics Engineers (IEEE)</publisher><issnPrint>2329-9266</issnPrint><issnElectronic>2329-9274</issnElectronic><keywords/><publishedDay>1</publishedDay><publishedMonth>3</publishedMonth><publishedYear>2020</publishedYear><publishedDate>2020-03-01</publishedDate><doi>10.1109/jas.2020.1003048</doi><url/><notes/><college>COLLEGE NANME</college><department>Mechanical Engineering</department><CollegeCode>COLLEGE CODE</CollegeCode><DepartmentCode>MECH</DepartmentCode><institution>Swansea University</institution><apcterm/><lastEdited>2020-07-06T17:34:20.1605859</lastEdited><Created>2020-03-27T09:08:15.2843762</Created><authors><author><firstname>Ameer Hamza</firstname><surname>Khan</surname><order>1</order></author><author><firstname>Xinwei</firstname><surname>Cao</surname><order>2</order></author><author><firstname>Shuai</firstname><surname>Li</surname><orcid>0000-0001-8316-5289</orcid><order>3</order></author><author><firstname>Vasilios N.</firstname><surname>Katsikis</surname><order>4</order></author><author><firstname>Liefa</firstname><surname>Liao</surname><order>5</order></author></authors><documents><document><filename>53871__16978__2121faacfda04415ae70a29dca860e78.pdf</filename><originalFilename>53871.pdf</originalFilename><uploaded>2020-03-30T15:41:21.6925774</uploaded><type>Output</type><contentLength>2843325</contentLength><contentType>application/pdf</contentType><version>Accepted Manuscript</version><cronfaStatus>true</cronfaStatus><copyrightCorrect>true</copyrightCorrect><language>eng</language></document></documents><OutputDurs/></rfc1807>