From patchwork Wed Jan 11 14:23:57 2023
X-Patchwork-Submitter: Stamatis Markianos-Wright
X-Patchwork-Id: 42027
Message-ID: <93eea5fd-25c8-dc11-c49f-4c36bd84eb14@arm.com>
Date: Wed, 11 Jan 2023 14:23:57 +0000
Subject: [PATCH 1/2 v2] arm: Add define_attr to create a mapping between MVE predicated and unpredicated insns
To: "gcc-patches@gcc.gnu.org"
Cc: Kyrylo Tkachov, "Andre Vieira (lists)", Richard Earnshaw, nickc@redhat.com, ramana.gcc@gmail.com
From: Stamatis Markianos-Wright

----- Respin of the below patch -----

In this 1/2 patch, from v1 to v2 I have added:

* The three new helper #defines in arm.h.
* Attribute mappings to unpredicated MVE instructions that map to
  themselves.  This allows us to distinguish between unpredicated insns
  that do have a VPT predicated form (are VPT predicable) and insns that
  do not.

Original email with updated Changelog at the end:

Hi all,

I'd like to submit two patches that add support for Arm's MVE Tail
Predicated Low Overhead Loop feature.

--- Introduction ---

The M-class Arm-ARM:
https://developer.arm.com/documentation/ddi0553/bu/?lang=en
Section B5.5.1 "Loop tail predication" describes the feature we are
adding support for with this patch (although we only add codegen for
DLSTP/LETP instruction loops).

Previously, with commit d2ed233cb94, we'd added support for non-MVE
DLS/LE loops through the loop-doloop pass, which, given a standard MVE
loop like:

```
void __attribute__ ((noinline))
test (int16_t *a, int16_t *b, int16_t *c, int n)
{
  while (n > 0)
    {
      mve_pred16_t p = vctp16q (n);
      int16x8_t va = vldrhq_z_s16 (a, p);
      int16x8_t vb = vldrhq_z_s16 (b, p);
      int16x8_t vc = vaddq_x_s16 (va, vb, p);
      vstrhq_p_s16 (c, vc, p);
      c += 8;
      a += 8;
      b += 8;
      n -= 8;
    }
}
```
would output:

```
        dls     lr, lr
.L3:
        vctp.16 r3
        vmrs    ip, P0  @ movhi
        sxth    ip, ip
        vmsr    P0, ip  @ movhi
        mov     r4, r0
        vpst
        vldrht.16       q2, [r4]
        mov     r4, r1
        vmov    q3, q0
        vpst
        vldrht.16       q1, [r4]
        mov     r4, r2
        vpst
        vaddt.i16       q3, q2, q1
        subs    r3, r3, #8
        vpst
        vstrht.16       q3, [r4]
        adds    r0, r0, #16
        adds    r1, r1, #16
        adds    r2, r2, #16
        le      lr, .L3
```

where the LE instruction will decrement LR by 1, then compare and branch
if needed.

(There are also other inefficiencies with the above code, like the
pointless vmrs/sxth/vmsr on the VPR, the adds not being merged into the
vldrht/vstrht as #16 offsets, and some random movs!  But those are
different problems...)

The MVE version is similar, except that:

* Instead of DLS/LE the instructions are DLSTP/LETP.
* Instead of pre-calculating the number of iterations of the loop, we
  place the number of elements to be processed by the loop into LR.
* Instead of decrementing LR by one, LETP will decrement it by
  FPSCR.LTPSIZE, which is the number of elements being processed in each
  iteration: 16 for 8-bit elements, 8 for 16-bit elements, etc.
* On the final iteration, automatic Loop Tail Predication is performed,
  as if the instructions within the loop had been VPT predicated with a
  VCTP generating the VPR predicate in every loop iteration.
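The per-iteration LETP decrement described above is just the lane count of a 128-bit MVE vector for the given element width. A minimal sketch of that relationship (illustrative only, not part of the patch):

```python
# A 128-bit MVE vector holds VECTOR_BITS / element_bits lanes; on the
# architecture, FPSCR.LTPSIZE encodes this lane count and LETP subtracts
# it from LR each iteration.
VECTOR_BITS = 128

def letp_decrement(element_bits):
    """Elements processed per loop iteration for a given element width."""
    return VECTOR_BITS // element_bits

# e.g. 8-bit elements -> 16 lanes, 16-bit -> 8, 32-bit -> 4
for bits in (8, 16, 32):
    print(bits, letp_decrement(bits))
```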
The dlstp/letp loop now looks like:

```
        dlstp.16        lr, r3
.L14:
        mov     r3, r0
        vldrh.16        q3, [r3]
        mov     r3, r1
        vldrh.16        q2, [r3]
        mov     r3, r2
        vadd.i16        q3, q3, q2
        adds    r0, r0, #16
        vstrh.16        q3, [r3]
        adds    r1, r1, #16
        adds    r2, r2, #16
        letp    lr, .L14
```

Since the loop tail predication is automatic, we have eliminated the
VCTP that had been specified by the user in the intrinsic and converted
the VPT-predicated instructions into their unpredicated equivalents
(which also saves us from VPST insns).  The LETP instruction here
decrements LR by 8 in each iteration.

--- This 1/2 patch ---

This first patch lays some groundwork by adding an attribute to md
patterns; the second patch contains the functional changes.

One major difficulty in implementing MVE Tail-Predicated Low Overhead
Loops was the need to transform VPT-predicated insns in the insn chain
into their unpredicated equivalents, like:
`mve_vldrbq_z_ -> mve_vldrbq_`.

This requires a deterministic link between two different patterns in
mve.md.  This _could_ be done by re-ordering the entirety of mve.md so
that the patterns sit at some constant icode proximity (e.g. having the
_z variant immediately after the unpredicated version would mean that to
map from the former to the latter you could use icode-1), but that is a
very messy solution that would lead to complex, unknown dependencies
between patterns.

This patch provides an alternative way of doing that: using an insn
attribute to encode the icode of the unpredicated instruction.
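As a rough illustration of the idea, the attribute can be thought of as a table from each pattern to the icode of its unpredicated twin, with unpredicated-but-predicable patterns mapping to themselves. The Python model below is a sketch only (the names mirror the new arm.h helper macros, but the data structures and the sample icode names like `mve_vldrbq_z_v16qi` are illustrative, not GCC's actual internals):

```python
# Hypothetical model of the "mve_unpredicated_insn" attribute: predicated
# patterns record their unpredicated twin's icode; unpredicated patterns
# that have a VPT form record themselves; everything else has no entry.
MVE_UNPREDICATED_INSN = {
    "mve_vldrbq_z_v16qi": "mve_vldrbq_v16qi",  # VPT-predicated form
    "mve_vldrbq_v16qi": "mve_vldrbq_v16qi",    # unpredicated, predicable
}

def mve_vpt_predicable_insn_p(icode):
    # The attribute is set at all: a VPT form exists for this insn.
    return icode in MVE_UNPREDICATED_INSN

def mve_vpt_predicated_insn_p(icode):
    # Predicated: the attribute points at a *different* (unpredicated) icode.
    twin = MVE_UNPREDICATED_INSN.get(icode)
    return twin is not None and twin != icode

def mve_vpt_unpredicated_insn_p(icode):
    # Unpredicated but predicable: the attribute points back at itself.
    return MVE_UNPREDICATED_INSN.get(icode) == icode
```

This is how the second patch can walk the insn chain and, for every predicated insn in a candidate loop, look up the unpredicated equivalent in O(1) without any ordering constraints on mve.md.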
This was implemented by doing a find-and-replace across mve.md using the
following patterns:

Search:
    define_insn "(.*)_p_(.*)"((.|\n)*?)\n( )*\[\(set_attr
Replace:
    define_insn "$1_p_$2"$3\n$5[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr

Search:
    define_insn "(.*)_m_(.*)"((.|\n)*?)\n( )*\[\(set_attr
Replace:
    define_insn "$1_m_$2"$3\n$5[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr

Search:
    define_insn "(.*)_z_(.*)"((.|\n)*?)\n( )*\[\(set_attr
Replace:
    define_insn "$1_z_$2"$3\n$5[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_$1_$2"))\n$5 (set_attr

A number of manual fixes were then needed for the md patterns that did
not conform to the above.  Those changes were:

Dropped the type suffix _s/_u/_f:
    CODE_FOR_mve_vcmpcsq_n_
    CODE_FOR_mve_vcmpcsq_
    CODE_FOR_mve_vcmpeqq_n_
    CODE_FOR_mve_vcmpeqq_
    CODE_FOR_mve_vcmpgeq_n_
    CODE_FOR_mve_vcmpgeq_
    CODE_FOR_mve_vcmpgtq_n_
    CODE_FOR_mve_vcmpgtq_
    CODE_FOR_mve_vcmphiq_n_
    CODE_FOR_mve_vcmphiq_
    CODE_FOR_mve_vcmpleq_n_
    CODE_FOR_mve_vcmpleq_
    CODE_FOR_mve_vcmpltq_n_
    CODE_FOR_mve_vcmpltq_
    CODE_FOR_mve_vcmpneq_n_
    CODE_FOR_mve_vcmpneq_
    CODE_FOR_mve_vaddq
    CODE_FOR_mve_vcaddq_rot270
    CODE_FOR_mve_vcaddq_rot90
    CODE_FOR_mve_vcaddq_rot270
    CODE_FOR_mve_vcaddq_rot90
    CODE_FOR_mve_vcmlaq
    CODE_FOR_mve_vcmlaq_rot180
    CODE_FOR_mve_vcmlaq_rot270
    CODE_FOR_mve_vcmlaq_rot90
    CODE_FOR_mve_vcmulq
    CODE_FOR_mve_vcmulq_rot180
    CODE_FOR_mve_vcmulq_rot270
    CODE_FOR_mve_vcmulq_rot90

Dropped _wb_:
    CODE_FOR_mve_vidupq_u_insn
    CODE_FOR_mve_vddupq_u_insn

Dropped one underscore character:
    CODE_FOR_arm_vcx1qv16qi
    CODE_FOR_arm_vcx2qv16qi
    CODE_FOR_arm_vcx3qv16qi

No regressions on arm-none-eabi with an MVE target.

Thank you,
Stam Markianos-Wright

gcc/ChangeLog:

        * config/arm/arm.md (mve_unpredicated_insn): New attribute.
        * config/arm/arm.h (MVE_VPT_PREDICATED_INSN_P): New define.
        (MVE_VPT_UNPREDICATED_INSN_P): Likewise.
        (MVE_VPT_PREDICABLE_INSN_P): Likewise.
        * config/arm/vec-common.md (mve_vshlq_): Add attribute.
        * config/arm/mve.md (arm_vcx1q_p_v16qi): Add attribute.
        (arm_vcx1qv16qi): Likewise.
        (arm_vcx1qav16qi): Likewise.
        (arm_vcx1qv16qi): Likewise.
        (arm_vcx2q_p_v16qi): Likewise.
        (arm_vcx2qv16qi): Likewise.
        (arm_vcx2qav16qi): Likewise.
        (arm_vcx2qv16qi): Likewise.
        (arm_vcx3q_p_v16qi): Likewise.
        (arm_vcx3qv16qi): Likewise.
        (arm_vcx3qav16qi): Likewise.
        (arm_vcx3qv16qi): Likewise.
        (mve_vabavq_): Likewise.
        (mve_vabavq_p_): Likewise.
        (mve_vabdq_): Likewise.
        (mve_vabdq_f): Likewise.
        (mve_vabdq_m_): Likewise.
        (mve_vabdq_m_f): Likewise.
        (mve_vabsq_f): Likewise.
        (mve_vabsq_m_f): Likewise.
        (mve_vabsq_m_s): Likewise.
        (mve_vabsq_s): Likewise.
        (mve_vadciq_v4si): Likewise.
        (mve_vadciq_m_v4si): Likewise.
        (mve_vadcq_v4si): Likewise.
        (mve_vadcq_m_v4si): Likewise.
        (mve_vaddlvaq_v4si): Likewise.
        (mve_vaddlvaq_p_v4si): Likewise.
        (mve_vaddlvq_v4si): Likewise.
        (mve_vaddlvq_p_v4si): Likewise.
        (mve_vaddq_f): Likewise.
        (mve_vaddq_m_): Likewise.
        (mve_vaddq_m_f): Likewise.
        (mve_vaddq_m_n_): Likewise.
        (mve_vaddq_m_n_f): Likewise.
        (mve_vaddq_n_): Likewise.
        (mve_vaddq_n_f): Likewise.
        (mve_vaddq): Likewise.
        (mve_vaddvaq_): Likewise.
        (mve_vaddvaq_p_): Likewise.
        (mve_vaddvq_): Likewise.
        (mve_vaddvq_p_): Likewise.
        (mve_vandq_): Likewise.
        (mve_vandq_f): Likewise.
        (mve_vandq_m_): Likewise.
        (mve_vandq_m_f): Likewise.
        (mve_vandq_s): Likewise.
        (mve_vandq_u): Likewise.
        (mve_vbicq_): Likewise.
        (mve_vbicq_f): Likewise.
        (mve_vbicq_m_): Likewise.
        (mve_vbicq_m_f): Likewise.
        (mve_vbicq_m_n_): Likewise.
        (mve_vbicq_n_): Likewise.
        (mve_vbicq_s): Likewise.
        (mve_vbicq_u): Likewise.
        (mve_vbrsrq_m_n_): Likewise.
        (mve_vbrsrq_m_n_f): Likewise.
        (mve_vbrsrq_n_): Likewise.
        (mve_vbrsrq_n_f): Likewise.
        (mve_vcaddq_rot270_m_): Likewise.
        (mve_vcaddq_rot270_m_f): Likewise.
        (mve_vcaddq_rot270): Likewise.
        (mve_vcaddq_rot270): Likewise.
        (mve_vcaddq_rot90_m_): Likewise.
        (mve_vcaddq_rot90_m_f): Likewise.
        (mve_vcaddq_rot90): Likewise.
        (mve_vcaddq_rot90): Likewise.
        (mve_vcaddq): Likewise.
        (mve_vcaddq): Likewise.
        (mve_vclsq_m_s): Likewise.
        (mve_vclsq_s): Likewise.
        (mve_vclzq_): Likewise.
        (mve_vclzq_m_): Likewise.
        (mve_vclzq_s): Likewise.
        (mve_vclzq_u): Likewise.
        (mve_vcmlaq_m_f): Likewise.
        (mve_vcmlaq_rot180_m_f): Likewise.
        (mve_vcmlaq_rot180): Likewise.
        (mve_vcmlaq_rot270_m_f): Likewise.
        (mve_vcmlaq_rot270): Likewise.
        (mve_vcmlaq_rot90_m_f): Likewise.
        (mve_vcmlaq_rot90): Likewise.
        (mve_vcmlaq): Likewise.
        (mve_vcmlaq): Likewise.
        (mve_vcmpq_): Likewise.
        (mve_vcmpq_f): Likewise.
        (mve_vcmpq_n_): Likewise.
        (mve_vcmpq_n_f): Likewise.
        (mve_vcmpcsq_): Likewise.
        (mve_vcmpcsq_m_n_u): Likewise.
        (mve_vcmpcsq_m_u): Likewise.
        (mve_vcmpcsq_n_): Likewise.
        (mve_vcmpeqq_): Likewise.
        (mve_vcmpeqq_f): Likewise.
        (mve_vcmpeqq_m_): Likewise.
        (mve_vcmpeqq_m_f): Likewise.
        (mve_vcmpeqq_m_n_): Likewise.
        (mve_vcmpeqq_m_n_f): Likewise.
        (mve_vcmpeqq_n_): Likewise.
        (mve_vcmpeqq_n_f): Likewise.
        (mve_vcmpgeq_): Likewise.
        (mve_vcmpgeq_f): Likewise.
        (mve_vcmpgeq_m_f): Likewise.
        (mve_vcmpgeq_m_n_f): Likewise.
        (mve_vcmpgeq_m_n_s): Likewise.
        (mve_vcmpgeq_m_s): Likewise.
        (mve_vcmpgeq_n_): Likewise.
        (mve_vcmpgeq_n_f): Likewise.
        (mve_vcmpgtq_): Likewise.
        (mve_vcmpgtq_f): Likewise.
        (mve_vcmpgtq_m_f): Likewise.
        (mve_vcmpgtq_m_n_f): Likewise.
        (mve_vcmpgtq_m_n_s): Likewise.
        (mve_vcmpgtq_m_s): Likewise.
        (mve_vcmpgtq_n_): Likewise.
        (mve_vcmpgtq_n_f): Likewise.
        (mve_vcmphiq_): Likewise.
        (mve_vcmphiq_m_n_u): Likewise.
        (mve_vcmphiq_m_u): Likewise.
        (mve_vcmphiq_n_): Likewise.
        (mve_vcmpleq_): Likewise.
        (mve_vcmpleq_f): Likewise.
        (mve_vcmpleq_m_f): Likewise.
        (mve_vcmpleq_m_n_f): Likewise.
        (mve_vcmpleq_m_n_s): Likewise.
        (mve_vcmpleq_m_s): Likewise.
        (mve_vcmpleq_n_): Likewise.
        (mve_vcmpleq_n_f): Likewise.
        (mve_vcmpltq_): Likewise.
        (mve_vcmpltq_f): Likewise.
        (mve_vcmpltq_m_f): Likewise.
        (mve_vcmpltq_m_n_f): Likewise.
        (mve_vcmpltq_m_n_s): Likewise.
        (mve_vcmpltq_m_s): Likewise.
        (mve_vcmpltq_n_): Likewise.
        (mve_vcmpltq_n_f): Likewise.
        (mve_vcmpneq_): Likewise.
        (mve_vcmpneq_f): Likewise.
        (mve_vcmpneq_m_): Likewise.
        (mve_vcmpneq_m_f): Likewise.
        (mve_vcmpneq_m_n_): Likewise.
        (mve_vcmpneq_m_n_f): Likewise.
        (mve_vcmpneq_n_): Likewise.
        (mve_vcmpneq_n_f): Likewise.
        (mve_vcmulq_m_f): Likewise.
        (mve_vcmulq_rot180_m_f): Likewise.
        (mve_vcmulq_rot180): Likewise.
        (mve_vcmulq_rot270_m_f): Likewise.
        (mve_vcmulq_rot270): Likewise.
        (mve_vcmulq_rot90_m_f): Likewise.
        (mve_vcmulq_rot90): Likewise.
        (mve_vcmulq): Likewise.
        (mve_vcmulq): Likewise.
        (mve_vctpq_mhi): Likewise.
        (mve_vctpqhi): Likewise.
        (mve_vcvtaq_): Likewise.
        (mve_vcvtaq_m_): Likewise.
        (mve_vcvtbq_f16_f32v8hf): Likewise.
        (mve_vcvtbq_f32_f16v4sf): Likewise.
        (mve_vcvtbq_m_f16_f32v8hf): Likewise.
        (mve_vcvtbq_m_f32_f16v4sf): Likewise.
        (mve_vcvtmq_): Likewise.
        (mve_vcvtmq_m_): Likewise.
        (mve_vcvtnq_): Likewise.
        (mve_vcvtnq_m_): Likewise.
        (mve_vcvtpq_): Likewise.
        (mve_vcvtpq_m_): Likewise.
        (mve_vcvtq_from_f_): Likewise.
        (mve_vcvtq_m_from_f_): Likewise.
        (mve_vcvtq_m_n_from_f_): Likewise.
        (mve_vcvtq_m_n_to_f_): Likewise.
        (mve_vcvtq_m_to_f_): Likewise.
        (mve_vcvtq_n_from_f_): Likewise.
        (mve_vcvtq_n_to_f_): Likewise.
        (mve_vcvtq_to_f_): Likewise.
        (mve_vcvttq_f16_f32v8hf): Likewise.
        (mve_vcvttq_f32_f16v4sf): Likewise.
        (mve_vcvttq_m_f16_f32v8hf): Likewise.
        (mve_vcvttq_m_f32_f16v4sf): Likewise.
        (mve_vddupq_m_wb_u_insn): Likewise.
        (mve_vddupq_u_insn): Likewise.
        (mve_vdupq_m_n_): Likewise.
        (mve_vdupq_m_n_f): Likewise.
        (mve_vdupq_n_): Likewise.
        (mve_vdupq_n_f): Likewise.
        (mve_vdwdupq_m_wb_u_insn): Likewise.
        (mve_vdwdupq_wb_u_insn): Likewise.
        (mve_veorq_): Likewise.
        (mve_veorq_f): Likewise.
        (mve_veorq_m_): Likewise.
        (mve_veorq_m_f): Likewise.
        (mve_veorq_s): Likewise.
        (mve_veorq_u): Likewise.
        (mve_vfmaq_f): Likewise.
        (mve_vfmaq_m_f): Likewise.
        (mve_vfmaq_m_n_f): Likewise.
        (mve_vfmaq_n_f): Likewise.
        (mve_vfmasq_m_n_f): Likewise.
        (mve_vfmasq_n_f): Likewise.
        (mve_vfmsq_f): Likewise.
        (mve_vfmsq_m_f): Likewise.
        (mve_vhaddq_): Likewise.
        (mve_vhaddq_m_): Likewise.
        (mve_vhaddq_m_n_): Likewise.
        (mve_vhaddq_n_): Likewise.
        (mve_vhcaddq_rot270_m_s): Likewise.
        (mve_vhcaddq_rot270_s): Likewise.
        (mve_vhcaddq_rot90_m_s): Likewise.
        (mve_vhcaddq_rot90_s): Likewise.
        (mve_vhsubq_): Likewise.
        (mve_vhsubq_m_): Likewise.
        (mve_vhsubq_m_n_): Likewise.
        (mve_vhsubq_n_): Likewise.
        (mve_vidupq_m_wb_u_insn): Likewise.
        (mve_vidupq_u_insn): Likewise.
        (mve_viwdupq_m_wb_u_insn): Likewise.
        (mve_viwdupq_wb_u_insn): Likewise.
        (mve_vldrbq_): Likewise.
        (mve_vldrbq_gather_offset_): Likewise.
        (mve_vldrbq_gather_offset_z_): Likewise.
        (mve_vldrbq_z_): Likewise.
        (mve_vldrdq_gather_base_v2di): Likewise.
        (mve_vldrdq_gather_base_wb_v2di_insn): Likewise.
        (mve_vldrdq_gather_base_wb_z_v2di_insn): Likewise.
        (mve_vldrdq_gather_base_z_v2di): Likewise.
        (mve_vldrdq_gather_offset_v2di): Likewise.
        (mve_vldrdq_gather_offset_z_v2di): Likewise.
        (mve_vldrdq_gather_shifted_offset_v2di): Likewise.
        (mve_vldrdq_gather_shifted_offset_z_v2di): Likewise.
        (mve_vldrhq_): Likewise.
        (mve_vldrhq_fv8hf): Likewise.
        (mve_vldrhq_gather_offset_): Likewise.
        (mve_vldrhq_gather_offset_fv8hf): Likewise.
        (mve_vldrhq_gather_offset_z_): Likewise.
        (mve_vldrhq_gather_offset_z_fv8hf): Likewise.
        (mve_vldrhq_gather_shifted_offset_): Likewise.
        (mve_vldrhq_gather_shifted_offset_fv8hf): Likewise.
        (mve_vldrhq_gather_shifted_offset_z_): Likewise.
        (mve_vldrhq_gather_shifted_offset_z_fv8hf): Likewise.
        (mve_vldrhq_z_): Likewise.
        (mve_vldrhq_z_fv8hf): Likewise.
        (mve_vldrwq_v4si): Likewise.
        (mve_vldrwq_fv4sf): Likewise.
        (mve_vldrwq_gather_base_v4si): Likewise.
        (mve_vldrwq_gather_base_fv4sf): Likewise.
        (mve_vldrwq_gather_base_wb_v4si_insn): Likewise.
        (mve_vldrwq_gather_base_wb_fv4sf_insn): Likewise.
        (mve_vldrwq_gather_base_wb_z_v4si_insn): Likewise.
        (mve_vldrwq_gather_base_wb_z_fv4sf_insn): Likewise.
        (mve_vldrwq_gather_base_z_v4si): Likewise.
        (mve_vldrwq_gather_base_z_fv4sf): Likewise.
        (mve_vldrwq_gather_offset_v4si): Likewise.
        (mve_vldrwq_gather_offset_fv4sf): Likewise.
        (mve_vldrwq_gather_offset_z_v4si): Likewise.
        (mve_vldrwq_gather_offset_z_fv4sf): Likewise.
        (mve_vldrwq_gather_shifted_offset_v4si): Likewise.
        (mve_vldrwq_gather_shifted_offset_fv4sf): Likewise.
        (mve_vldrwq_gather_shifted_offset_z_v4si): Likewise.
        (mve_vldrwq_gather_shifted_offset_z_fv4sf): Likewise.
        (mve_vldrwq_z_v4si): Likewise.
        (mve_vldrwq_z_fv4sf): Likewise.
        (mve_vmaxaq_m_s): Likewise.
        (mve_vmaxaq_s): Likewise.
        (mve_vmaxavq_p_s): Likewise.
        (mve_vmaxavq_s): Likewise.
        (mve_vmaxnmaq_f): Likewise.
        (mve_vmaxnmaq_m_f): Likewise.
        (mve_vmaxnmavq_f): Likewise.
        (mve_vmaxnmavq_p_f): Likewise.
        (mve_vmaxnmq_f): Likewise.
        (mve_vmaxnmq_m_f): Likewise.
        (mve_vmaxnmvq_f): Likewise.
        (mve_vmaxnmvq_p_f): Likewise.
        (mve_vmaxq_): Likewise.
        (mve_vmaxq_m_): Likewise.
        (mve_vmaxq_s): Likewise.
        (mve_vmaxq_u): Likewise.
        (mve_vmaxvq_): Likewise.
        (mve_vmaxvq_p_): Likewise.
        (mve_vminaq_m_s): Likewise.
        (mve_vminaq_s): Likewise.
        (mve_vminavq_p_s): Likewise.
        (mve_vminavq_s): Likewise.
        (mve_vminnmaq_f): Likewise.
        (mve_vminnmaq_m_f): Likewise.
        (mve_vminnmavq_f): Likewise.
        (mve_vminnmavq_p_f): Likewise.
        (mve_vminnmq_f): Likewise.
        (mve_vminnmq_m_f): Likewise.
        (mve_vminnmvq_f): Likewise.
        (mve_vminnmvq_p_f): Likewise.
        (mve_vminq_): Likewise.
        (mve_vminq_m_): Likewise.
        (mve_vminq_s): Likewise.
        (mve_vminq_u): Likewise.
        (mve_vminvq_): Likewise.
        (mve_vminvq_p_): Likewise.
        (mve_vmladavaq_): Likewise.
        (mve_vmladavaq_p_): Likewise.
        (mve_vmladavaxq_p_s): Likewise.
        (mve_vmladavaxq_s): Likewise.
        (mve_vmladavq_): Likewise.
        (mve_vmladavq_p_): Likewise.
        (mve_vmladavxq_p_s): Likewise.
        (mve_vmladavxq_s): Likewise.
        (mve_vmlaldavaq_): Likewise.
        (mve_vmlaldavaq_p_): Likewise.
        (mve_vmlaldavaxq_): Likewise.
        (mve_vmlaldavaxq_p_): Likewise.
        (mve_vmlaldavaxq_s): Likewise.
        (mve_vmlaldavq_): Likewise.
        (mve_vmlaldavq_p_): Likewise.
        (mve_vmlaldavxq_p_s): Likewise.
        (mve_vmlaldavxq_s): Likewise.
        (mve_vmlaq_m_n_): Likewise.
        (mve_vmlaq_n_): Likewise.
        (mve_vmlasq_m_n_): Likewise.
        (mve_vmlasq_n_): Likewise.
        (mve_vmlsdavaq_p_s): Likewise.
        (mve_vmlsdavaq_s): Likewise.
        (mve_vmlsdavaxq_p_s): Likewise.
        (mve_vmlsdavaxq_s): Likewise.
        (mve_vmlsdavq_p_s): Likewise.
        (mve_vmlsdavq_s): Likewise.
        (mve_vmlsdavxq_p_s): Likewise.
        (mve_vmlsdavxq_s): Likewise.
        (mve_vmlsldavaq_p_s): Likewise.
        (mve_vmlsldavaq_s): Likewise.
        (mve_vmlsldavaxq_p_s): Likewise.
        (mve_vmlsldavaxq_s): Likewise.
        (mve_vmlsldavq_p_s): Likewise.
        (mve_vmlsldavq_s): Likewise.
        (mve_vmlsldavxq_p_s): Likewise.
        (mve_vmlsldavxq_s): Likewise.
        (mve_vmovlbq_): Likewise.
        (mve_vmovlbq_m_): Likewise.
        (mve_vmovltq_): Likewise.
        (mve_vmovltq_m_): Likewise.
        (mve_vmovnbq_): Likewise.
        (mve_vmovnbq_m_): Likewise.
        (mve_vmovntq_): Likewise.
        (mve_vmovntq_m_): Likewise.
        (mve_vmulhq_): Likewise.
        (mve_vmulhq_m_): Likewise.
	(mve_vmullbq_int_): Likewise.
	(mve_vmullbq_int_m_): Likewise.
	(mve_vmullbq_poly_m_p): Likewise.
	(mve_vmullbq_poly_p): Likewise.
	(mve_vmulltq_int_): Likewise.
	(mve_vmulltq_int_m_): Likewise.
	(mve_vmulltq_poly_m_p): Likewise.
	(mve_vmulltq_poly_p): Likewise.
	(mve_vmulq_): Likewise.
	(mve_vmulq_f): Likewise.
	(mve_vmulq_m_): Likewise.
	(mve_vmulq_m_f): Likewise.
	(mve_vmulq_m_n_): Likewise.
	(mve_vmulq_m_n_f): Likewise.
	(mve_vmulq_n_): Likewise.
	(mve_vmulq_n_f): Likewise.
	(mve_vmvnq_): Likewise.
	(mve_vmvnq_m_): Likewise.
	(mve_vmvnq_m_n_): Likewise.
	(mve_vmvnq_n_): Likewise.
	(mve_vmvnq_s): Likewise.
	(mve_vmvnq_u): Likewise.
	(mve_vnegq_f): Likewise.
	(mve_vnegq_m_f): Likewise.
	(mve_vnegq_m_s): Likewise.
	(mve_vnegq_s): Likewise.
	(mve_vornq_): Likewise.
	(mve_vornq_f): Likewise.
	(mve_vornq_m_): Likewise.
	(mve_vornq_m_f): Likewise.
	(mve_vornq_s): Likewise.
	(mve_vornq_u): Likewise.
	(mve_vorrq_): Likewise.
	(mve_vorrq_f): Likewise.
	(mve_vorrq_m_): Likewise.
	(mve_vorrq_m_f): Likewise.
	(mve_vorrq_m_n_): Likewise.
	(mve_vorrq_n_): Likewise.
	(mve_vorrq_s): Likewise.
	(mve_vqabsq_m_s): Likewise.
	(mve_vqabsq_s): Likewise.
	(mve_vqaddq_): Likewise.
	(mve_vqaddq_m_): Likewise.
	(mve_vqaddq_m_n_): Likewise.
	(mve_vqaddq_n_): Likewise.
	(mve_vqdmladhq_m_s): Likewise.
	(mve_vqdmladhq_s): Likewise.
	(mve_vqdmladhxq_m_s): Likewise.
	(mve_vqdmladhxq_s): Likewise.
	(mve_vqdmlahq_m_n_s): Likewise.
	(mve_vqdmlahq_n_): Likewise.
	(mve_vqdmlahq_n_s): Likewise.
	(mve_vqdmlashq_m_n_s): Likewise.
	(mve_vqdmlashq_n_): Likewise.
	(mve_vqdmlashq_n_s): Likewise.
	(mve_vqdmlsdhq_m_s): Likewise.
	(mve_vqdmlsdhq_s): Likewise.
	(mve_vqdmlsdhxq_m_s): Likewise.
	(mve_vqdmlsdhxq_s): Likewise.
	(mve_vqdmulhq_m_n_s): Likewise.
	(mve_vqdmulhq_m_s): Likewise.
	(mve_vqdmulhq_n_s): Likewise.
	(mve_vqdmulhq_s): Likewise.
	(mve_vqdmullbq_m_n_s): Likewise.
	(mve_vqdmullbq_m_s): Likewise.
	(mve_vqdmullbq_n_s): Likewise.
	(mve_vqdmullbq_s): Likewise.
	(mve_vqdmulltq_m_n_s): Likewise.
	(mve_vqdmulltq_m_s): Likewise.
	(mve_vqdmulltq_n_s): Likewise.
	(mve_vqdmulltq_s): Likewise.
	(mve_vqmovnbq_): Likewise.
	(mve_vqmovnbq_m_): Likewise.
	(mve_vqmovntq_): Likewise.
	(mve_vqmovntq_m_): Likewise.
	(mve_vqmovunbq_m_s): Likewise.
	(mve_vqmovunbq_s): Likewise.
	(mve_vqmovuntq_m_s): Likewise.
	(mve_vqmovuntq_s): Likewise.
	(mve_vqnegq_m_s): Likewise.
	(mve_vqnegq_s): Likewise.
	(mve_vqrdmladhq_m_s): Likewise.
	(mve_vqrdmladhq_s): Likewise.
	(mve_vqrdmladhxq_m_s): Likewise.
	(mve_vqrdmladhxq_s): Likewise.
	(mve_vqrdmlahq_m_n_s): Likewise.
	(mve_vqrdmlahq_n_): Likewise.
	(mve_vqrdmlahq_n_s): Likewise.
	(mve_vqrdmlashq_m_n_s): Likewise.
	(mve_vqrdmlashq_n_): Likewise.
	(mve_vqrdmlashq_n_s): Likewise.
	(mve_vqrdmlsdhq_m_s): Likewise.
	(mve_vqrdmlsdhq_s): Likewise.
	(mve_vqrdmlsdhxq_m_s): Likewise.
	(mve_vqrdmlsdhxq_s): Likewise.
	(mve_vqrdmulhq_m_n_s): Likewise.
	(mve_vqrdmulhq_m_s): Likewise.
	(mve_vqrdmulhq_n_s): Likewise.
	(mve_vqrdmulhq_s): Likewise.
	(mve_vqrshlq_): Likewise.
	(mve_vqrshlq_m_): Likewise.
	(mve_vqrshlq_m_n_): Likewise.
	(mve_vqrshlq_n_): Likewise.
	(mve_vqrshrnbq_m_n_): Likewise.
	(mve_vqrshrnbq_n_): Likewise.
	(mve_vqrshrntq_m_n_): Likewise.
	(mve_vqrshrntq_n_): Likewise.
	(mve_vqrshrunbq_m_n_s): Likewise.
	(mve_vqrshrunbq_n_s): Likewise.
	(mve_vqrshruntq_m_n_s): Likewise.
	(mve_vqrshruntq_n_s): Likewise.
	(mve_vqshlq_): Likewise.
	(mve_vqshlq_m_): Likewise.
	(mve_vqshlq_m_n_): Likewise.
	(mve_vqshlq_m_r_): Likewise.
	(mve_vqshlq_n_): Likewise.
	(mve_vqshlq_r_): Likewise.
	(mve_vqshluq_m_n_s): Likewise.
	(mve_vqshluq_n_s): Likewise.
	(mve_vqshrnbq_m_n_): Likewise.
	(mve_vqshrnbq_n_): Likewise.
	(mve_vqshrntq_m_n_): Likewise.
	(mve_vqshrntq_n_): Likewise.
	(mve_vqshrunbq_m_n_s): Likewise.
	(mve_vqshrunbq_n_s): Likewise.
	(mve_vqshruntq_m_n_s): Likewise.
	(mve_vqshruntq_n_s): Likewise.
	(mve_vqsubq_): Likewise.
	(mve_vqsubq_m_): Likewise.
	(mve_vqsubq_m_n_): Likewise.
	(mve_vqsubq_n_): Likewise.
	(mve_vrev16q_v16qi): Likewise.
	(mve_vrev16q_m_v16qi): Likewise.
	(mve_vrev32q_): Likewise.
	(mve_vrev32q_fv8hf): Likewise.
	(mve_vrev32q_m_): Likewise.
	(mve_vrev32q_m_fv8hf): Likewise.
	(mve_vrev64q_): Likewise.
	(mve_vrev64q_f): Likewise.
	(mve_vrev64q_m_): Likewise.
	(mve_vrev64q_m_f): Likewise.
	(mve_vrhaddq_): Likewise.
	(mve_vrhaddq_m_): Likewise.
	(mve_vrmlaldavhaq_v4si): Likewise.
	(mve_vrmlaldavhaq_p_sv4si): Likewise.
	(mve_vrmlaldavhaq_p_uv4si): Likewise.
	(mve_vrmlaldavhaq_sv4si): Likewise.
	(mve_vrmlaldavhaq_uv4si): Likewise.
	(mve_vrmlaldavhaxq_p_sv4si): Likewise.
	(mve_vrmlaldavhaxq_sv4si): Likewise.
	(mve_vrmlaldavhq_v4si): Likewise.
	(mve_vrmlaldavhq_p_v4si): Likewise.
	(mve_vrmlaldavhxq_p_sv4si): Likewise.
	(mve_vrmlaldavhxq_sv4si): Likewise.
	(mve_vrmlsldavhaq_p_sv4si): Likewise.
	(mve_vrmlsldavhaq_sv4si): Likewise.
	(mve_vrmlsldavhaxq_p_sv4si): Likewise.
	(mve_vrmlsldavhaxq_sv4si): Likewise.
	(mve_vrmlsldavhq_p_sv4si): Likewise.
	(mve_vrmlsldavhq_sv4si): Likewise.
	(mve_vrmlsldavhxq_p_sv4si): Likewise.
	(mve_vrmlsldavhxq_sv4si): Likewise.
	(mve_vrmulhq_): Likewise.
	(mve_vrmulhq_m_): Likewise.
	(mve_vrndaq_f): Likewise.
	(mve_vrndaq_m_f): Likewise.
	(mve_vrndmq_f): Likewise.
	(mve_vrndmq_m_f): Likewise.
	(mve_vrndnq_f): Likewise.
	(mve_vrndnq_m_f): Likewise.
	(mve_vrndpq_f): Likewise.
	(mve_vrndpq_m_f): Likewise.
	(mve_vrndq_f): Likewise.
	(mve_vrndq_m_f): Likewise.
	(mve_vrndxq_f): Likewise.
	(mve_vrndxq_m_f): Likewise.
	(mve_vrshlq_): Likewise.
	(mve_vrshlq_m_): Likewise.
	(mve_vrshlq_m_n_): Likewise.
	(mve_vrshlq_n_): Likewise.
	(mve_vrshrnbq_m_n_): Likewise.
	(mve_vrshrnbq_n_): Likewise.
	(mve_vrshrntq_m_n_): Likewise.
	(mve_vrshrntq_n_): Likewise.
	(mve_vrshrq_m_n_): Likewise.
	(mve_vrshrq_n_): Likewise.
	(mve_vsbciq_v4si): Likewise.
	(mve_vsbciq_m_v4si): Likewise.
	(mve_vsbcq_v4si): Likewise.
	(mve_vsbcq_m_v4si): Likewise.
	(mve_vshlcq_): Likewise.
	(mve_vshlcq_m_): Likewise.
	(mve_vshllbq_m_n_): Likewise.
	(mve_vshllbq_n_): Likewise.
	(mve_vshlltq_m_n_): Likewise.
	(mve_vshlltq_n_): Likewise.
	(mve_vshlq_): Likewise.
	(mve_vshlq_m_): Likewise.
	(mve_vshlq_m_n_): Likewise.
	(mve_vshlq_m_r_): Likewise.
	(mve_vshlq_n_): Likewise.
	(mve_vshlq_r_): Likewise.
	(mve_vshrnbq_m_n_): Likewise.
	(mve_vshrnbq_n_): Likewise.
	(mve_vshrntq_m_n_): Likewise.
	(mve_vshrntq_n_): Likewise.
	(mve_vshrq_m_n_): Likewise.
	(mve_vshrq_n_): Likewise.
	(mve_vsliq_m_n_): Likewise.
	(mve_vsliq_n_): Likewise.
	(mve_vsriq_m_n_): Likewise.
	(mve_vsriq_n_): Likewise.
	(mve_vstrbq_): Likewise.
	(mve_vstrbq_p_): Likewise.
	(mve_vstrbq_scatter_offset__insn): Likewise.
	(mve_vstrbq_scatter_offset_p__insn): Likewise.
	(mve_vstrdq_scatter_base_v2di): Likewise.
	(mve_vstrdq_scatter_base_p_v2di): Likewise.
	(mve_vstrdq_scatter_base_wb_v2di): Likewise.
	(mve_vstrdq_scatter_base_wb_p_v2di): Likewise.
	(mve_vstrdq_scatter_offset_v2di_insn): Likewise.
	(mve_vstrdq_scatter_offset_p_v2di_insn): Likewise.
	(mve_vstrdq_scatter_shifted_offset_v2di_insn): Likewise.
	(mve_vstrdq_scatter_shifted_offset_p_v2di_insn): Likewise.
	(mve_vstrhq_): Likewise.
	(mve_vstrhq_fv8hf): Likewise.
	(mve_vstrhq_p_): Likewise.
	(mve_vstrhq_p_fv8hf): Likewise.
	(mve_vstrhq_scatter_offset__insn): Likewise.
	(mve_vstrhq_scatter_offset_fv8hf_insn): Likewise.
	(mve_vstrhq_scatter_offset_p__insn): Likewise.
	(mve_vstrhq_scatter_offset_p_fv8hf_insn): Likewise.
	(mve_vstrhq_scatter_shifted_offset__insn): Likewise.
	(mve_vstrhq_scatter_shifted_offset_fv8hf_insn): Likewise.
	(mve_vstrhq_scatter_shifted_offset_p__insn): Likewise.
	(mve_vstrhq_scatter_shifted_offset_p_fv8hf_insn): Likewise.
	(mve_vstrwq_v4si): Likewise.
	(mve_vstrwq_fv4sf): Likewise.
	(mve_vstrwq_p_v4si): Likewise.
	(mve_vstrwq_p_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_v4si): Likewise.
	(mve_vstrwq_scatter_base_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_p_v4si): Likewise.
	(mve_vstrwq_scatter_base_p_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_wb_v4si): Likewise.
	(mve_vstrwq_scatter_base_wb_fv4sf): Likewise.
	(mve_vstrwq_scatter_base_wb_p_v4si): Likewise.
	(mve_vstrwq_scatter_base_wb_p_fv4sf): Likewise.
	(mve_vstrwq_scatter_offset_v4si_insn): Likewise.
	(mve_vstrwq_scatter_offset_fv4sf_insn): Likewise.
	(mve_vstrwq_scatter_offset_p_v4si_insn): Likewise.
	(mve_vstrwq_scatter_offset_p_fv4sf_insn): Likewise.
	(mve_vstrwq_scatter_shifted_offset_v4si_insn): Likewise.
	(mve_vstrwq_scatter_shifted_offset_fv4sf_insn): Likewise.
	(mve_vstrwq_scatter_shifted_offset_p_v4si_insn): Likewise.
	(mve_vstrwq_scatter_shifted_offset_p_fv4sf_insn): Likewise.
	(mve_vsubq_): Likewise.
	(mve_vsubq_f): Likewise.
	(mve_vsubq_m_): Likewise.
	(mve_vsubq_m_f): Likewise.
	(mve_vsubq_m_n_): Likewise.
	(mve_vsubq_m_n_f): Likewise.
	(mve_vsubq_n_): Likewise.
	(mve_vsubq_n_f): Likewise.

gcc/testsuite/ChangeLog:

	* gcc.target/arm/dlstp-compile-asm.c: New test.
diff --git a/gcc/config/arm/arm.h b/gcc/config/arm/arm.h
index 984ab789285dd0bb8d648fda89053b24ada93698..753a926a3efc3b8b39f094610b92457e58871b2a 100644
--- a/gcc/config/arm/arm.h
+++ b/gcc/config/arm/arm.h
@@ -2331,6 +2331,21 @@ extern int making_const_table;
   else if (TARGET_THUMB1)				\
     thumb1_final_prescan_insn (INSN)
 
+/* These defines are useful to refer to the value of the mve_unpredicated_insn
+   insn attribute.  Note that, because these use the get_attr_* function, these
+   will change recog_data if (INSN) isn't current_insn.  */
+#define MVE_VPT_PREDICATED_INSN_P(INSN)					\
+  (recog_memoized (INSN) >= 0						\
+   && recog_memoized (INSN) != get_attr_mve_unpredicated_insn (INSN))	\
+
+#define MVE_VPT_UNPREDICATED_INSN_P(INSN)				\
+  (recog_memoized (INSN) >= 0						\
+   && recog_memoized (INSN) == get_attr_mve_unpredicated_insn (INSN))	\
+
+#define MVE_VPT_PREDICABLE_INSN_P(INSN)					\
+  (recog_memoized (INSN) >= 0						\
+   && get_attr_mve_unpredicated_insn (INSN) != 0)			\
+
 #define ARM_SIGN_EXTEND(x)  ((HOST_WIDE_INT)			\
   (HOST_BITS_PER_WIDE_INT <= 32 ? (unsigned HOST_WIDE_INT) (x)	\
    : ((((unsigned HOST_WIDE_INT)(x)) & (unsigned HOST_WIDE_INT) 0xffffffff) |\
diff --git a/gcc/config/arm/arm.md b/gcc/config/arm/arm.md
index 69bf343fb0ed601014979cfc1803abe84c87f179..e1d2e62593085accfcc111cf6fa5795e4520f213 100644
--- a/gcc/config/arm/arm.md
+++ b/gcc/config/arm/arm.md
@@ -123,6 +123,8 @@
 ; and not all ARM insns do.
(define_attr "predicated" "yes,no" (const_string "no")) +(define_attr "mve_unpredicated_insn" "" (const_int 0)) + ; LENGTH of an instruction (in bytes) (define_attr "length" "" (const_int 4)) diff --git a/gcc/config/arm/mve.md b/gcc/config/arm/mve.md index 0c21db100266129ebbfa861e09e32113d102bf02..03f816ff0f28b27c8835f19894889fa6066c1ad2 100644 --- a/gcc/config/arm/mve.md +++ b/gcc/config/arm/mve.md @@ -142,7 +142,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vrintzt.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -156,7 +157,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vrintx.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndxq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -170,7 +172,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vrintz.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -184,7 +187,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vrintp.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndpq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -198,7 +202,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vrintn.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndnq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -212,7 +217,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vrintm.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndmq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -226,7 +232,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vrinta.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vrndaq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -240,7 +247,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vrev64.%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_f")) + (set_attr "type" "mve_move") ]) ;; @@ -253,7 +261,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vneg.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -267,7 +276,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vdup.%#\t%q0, %1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -280,7 +290,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vabs.f%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -294,7 +305,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vrev32.16 %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_fv8hf")) + (set_attr "type" "mve_move") ]) ;; ;; [vcvttq_f32_f16]) @@ -307,7 +319,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtt.f32.f16 %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf")) + (set_attr "type" "mve_move") ]) ;; @@ -321,7 +334,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtb.f32.f16 %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf")) + (set_attr "type" "mve_move") ]) ;; @@ -335,7 +349,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt.f%#.%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_")) + (set_attr "type" "mve_move") ]) ;; @@ -349,7 +364,8 @@ ] 
"TARGET_HAVE_MVE" "vrev64.%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_")) + (set_attr "type" "mve_move") ]) ;; @@ -363,7 +379,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt.%#.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_")) + (set_attr "type" "mve_move") ]) ;; [vqnegq_s]) ;; @@ -375,7 +392,8 @@ ] "TARGET_HAVE_MVE" "vqneg.s%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqnegq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -389,7 +407,8 @@ ] "TARGET_HAVE_MVE" "vqabs.s%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqabsq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -402,7 +421,8 @@ ] "TARGET_HAVE_MVE" "vneg.s%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -415,7 +435,8 @@ ] "TARGET_HAVE_MVE" "vmvn\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_u")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vmvnq_s" [ @@ -436,7 +457,8 @@ ] "TARGET_HAVE_MVE" "vdup.%#\t%q0, %1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -449,7 +471,8 @@ ] "TARGET_HAVE_MVE" "vclz.i%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_s")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vclzq_u" [ @@ -470,7 +493,8 @@ ] "TARGET_HAVE_MVE" "vcls.s%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclsq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -484,7 +508,8 @@ ] "TARGET_HAVE_MVE" "vaddv.%#\t%0, %q1" - 
[(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvq_")) + (set_attr "type" "mve_move") ]) ;; @@ -497,7 +522,8 @@ ] "TARGET_HAVE_MVE" "vabs.s%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -511,7 +537,8 @@ ] "TARGET_HAVE_MVE" "vrev32.%#\t%q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_")) + (set_attr "type" "mve_move") ]) ;; @@ -525,7 +552,8 @@ ] "TARGET_HAVE_MVE" "vmovlt.%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovltq_")) + (set_attr "type" "mve_move") ]) ;; @@ -539,7 +567,8 @@ ] "TARGET_HAVE_MVE" "vmovlb.%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovlbq_")) + (set_attr "type" "mve_move") ]) ;; @@ -553,7 +582,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtp.%#.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_")) + (set_attr "type" "mve_move") ]) ;; @@ -567,7 +597,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtn.%#.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_")) + (set_attr "type" "mve_move") ]) ;; @@ -581,7 +612,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtm.%#.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_")) + (set_attr "type" "mve_move") ]) ;; @@ -595,7 +627,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvta.%#.f%# %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_")) + (set_attr "type" "mve_move") ]) ;; @@ -609,7 +642,8 @@ ] "TARGET_HAVE_MVE" "vmvn.i%# %q0, %1" - [(set_attr "type" "mve_move") + 
[(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -623,7 +657,8 @@ ] "TARGET_HAVE_MVE" "vrev16.8 %q0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev16q_v16qi")) + (set_attr "type" "mve_move") ]) ;; @@ -637,7 +672,8 @@ ] "TARGET_HAVE_MVE" "vaddlv.32\t%Q0, %R0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvq_v4si")) + (set_attr "type" "mve_move") ]) ;; @@ -651,7 +687,8 @@ ] "TARGET_HAVE_MVE" "vctp. %1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctpqhi")) + (set_attr "type" "mve_move") ]) ;; @@ -680,7 +717,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vsub.f\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -695,7 +733,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vbrsr. 
%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -710,7 +749,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt.f.\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_")) + (set_attr "type" "mve_move") ]) ;; [vcreateq_f]) @@ -755,7 +795,8 @@ ] "TARGET_HAVE_MVE" "vshr.\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_")) + (set_attr "type" "mve_move") ]) ;; Versions that take constant vectors as operand 2 (with all elements @@ -803,7 +844,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvt..f\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_")) + (set_attr "type" "mve_move") ]) ;; @@ -818,8 +860,9 @@ ] "TARGET_HAVE_MVE" "vpst\;vaddlvt.32\t%Q0, %R0, %q1" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvq_v4si")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vcmpneq_, vcmpcsq_, vcmpeqq_, vcmpgeq_, vcmpgtq_, vcmphiq_, vcmpleq_, vcmpltq_]) @@ -832,7 +875,8 @@ ] "TARGET_HAVE_MVE" "vcmp.%#\t, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_")) + (set_attr "type" "mve_move") ]) ;; @@ -847,7 +891,8 @@ ] "TARGET_HAVE_MVE" "vcmp.%# , %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -866,7 +911,8 @@ ] "TARGET_HAVE_MVE" "vabd.%# %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_")) + (set_attr "type" "mve_move") ]) ;; @@ -881,7 +927,8 @@ ] "TARGET_HAVE_MVE" "vadd.i%#\t%q0, %q1, %2" - [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -896,7 +943,8 @@ ] "TARGET_HAVE_MVE" "vaddva.%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvaq_")) + (set_attr "type" "mve_move") ]) ;; @@ -911,7 +959,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vaddvt.%# %0, %q1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -931,7 +980,8 @@ "@ vand\t%q0, %q1, %q2 * return neon_output_logic_immediate (\"vand\", &operands[2], mode, 1, VALID_NEON_QREG_MODE (mode));" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_u")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vandq_s" [ @@ -953,7 +1003,8 @@ ] "TARGET_HAVE_MVE" "vbic\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_u")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vbicq_s" @@ -977,7 +1028,8 @@ ] "TARGET_HAVE_MVE" "vbrsr.%# %q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -992,7 +1044,8 @@ ] "TARGET_HAVE_MVE" "vcadd.i%# %q0, %q1, %q2, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq")) + (set_attr "type" "mve_move") ]) ;; Auto vectorizer pattern for int vcadd @@ -1015,7 +1068,8 @@ ] "TARGET_HAVE_MVE" "veor\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_u")) + (set_attr "type" "mve_move") ]) (define_expand "mve_veorq_s" [ @@ -1038,7 +1092,8 @@ ] "TARGET_HAVE_MVE" "vhadd.%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_n_")) + 
(set_attr "type" "mve_move") ]) ;; @@ -1053,7 +1108,8 @@ ] "TARGET_HAVE_MVE" "vhadd.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1068,7 +1124,8 @@ ] "TARGET_HAVE_MVE" "vhcadd.s%#\t%q0, %q1, %q2, #270" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot270_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1083,7 +1140,8 @@ ] "TARGET_HAVE_MVE" "vhcadd.s%#\t%q0, %q1, %q2, #90" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot90_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1098,7 +1156,8 @@ ] "TARGET_HAVE_MVE" "vhsub.%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1113,7 +1172,8 @@ ] "TARGET_HAVE_MVE" "vhsub.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1128,7 +1188,8 @@ ] "TARGET_HAVE_MVE" "vmaxa.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxaq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1143,7 +1204,8 @@ ] "TARGET_HAVE_MVE" "vmaxav.s%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxavq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1157,7 +1219,8 @@ ] "TARGET_HAVE_MVE" "vmax.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxq_s")) + (set_attr "type" "mve_move") ]) (define_insn "mve_vmaxq_u" @@ -1168,7 +1231,8 @@ ] "TARGET_HAVE_MVE" "vmax.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxq_u")) + (set_attr "type" "mve_move") ]) 
;; @@ -1183,7 +1247,8 @@ ] "TARGET_HAVE_MVE" "vmaxv.%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxvq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1198,7 +1263,8 @@ ] "TARGET_HAVE_MVE" "vmina.s%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminaq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1213,7 +1279,8 @@ ] "TARGET_HAVE_MVE" "vminav.s%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminavq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1227,7 +1294,8 @@ ] "TARGET_HAVE_MVE" "vmin.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminq_s")) + (set_attr "type" "mve_move") ]) (define_insn "mve_vminq_u" @@ -1238,7 +1306,8 @@ ] "TARGET_HAVE_MVE" "vmin.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminq_u")) + (set_attr "type" "mve_move") ]) ;; @@ -1253,7 +1322,8 @@ ] "TARGET_HAVE_MVE" "vminv.%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminvq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1268,7 +1338,8 @@ ] "TARGET_HAVE_MVE" "vmladav.%#\t%0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1283,7 +1354,8 @@ ] "TARGET_HAVE_MVE" "vmladavx.s%#\t%0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1298,7 +1370,8 @@ ] "TARGET_HAVE_MVE" "vmlsdav.s%#\t%0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1313,7 +1386,8 @@ ] "TARGET_HAVE_MVE" 
"vmlsdavx.s%#\t%0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1328,7 +1402,8 @@ ] "TARGET_HAVE_MVE" "vmulh.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulhq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1343,7 +1418,8 @@ ] "TARGET_HAVE_MVE" "vmullb.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_")) + (set_attr "type" "mve_move") ]) ;; @@ -1358,7 +1434,8 @@ ] "TARGET_HAVE_MVE" "vmullt.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_")) + (set_attr "type" "mve_move") ]) ;; @@ -1373,7 +1450,8 @@ ] "TARGET_HAVE_MVE" "vmul.i%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1388,7 +1466,8 @@ ] "TARGET_HAVE_MVE" "vmul.i%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_")) + (set_attr "type" "mve_move") ]) (define_insn "mve_vmulq" @@ -1399,7 +1478,8 @@ ] "TARGET_HAVE_MVE" "vmul.i%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq")) + (set_attr "type" "mve_move") ]) ;; @@ -1413,7 +1493,8 @@ ] "TARGET_HAVE_MVE" "vorn\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_s")) + (set_attr "type" "mve_move") ]) (define_expand "mve_vornq_u" @@ -1442,7 +1523,8 @@ "@ vorr\t%q0, %q1, %q2 * return neon_output_logic_immediate (\"vorr\", &operands[2], mode, 0, VALID_NEON_QREG_MODE (mode));" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s")) + (set_attr "type" 
"mve_move") ]) (define_expand "mve_vorrq_u" [ @@ -1465,7 +1547,8 @@ ] "TARGET_HAVE_MVE" "vqadd.%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1480,7 +1563,8 @@ ] "TARGET_HAVE_MVE" "vqadd.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1495,7 +1579,8 @@ ] "TARGET_HAVE_MVE" "vqdmulh.s%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_n_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1510,7 +1595,8 @@ ] "TARGET_HAVE_MVE" "vqdmulh.s%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1525,7 +1611,8 @@ ] "TARGET_HAVE_MVE" "vqrdmulh.s%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_n_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1540,7 +1627,8 @@ ] "TARGET_HAVE_MVE" "vqrdmulh.s%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1555,7 +1643,8 @@ ] "TARGET_HAVE_MVE" "vqrshl.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1570,7 +1659,8 @@ ] "TARGET_HAVE_MVE" "vqrshl.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1585,7 +1675,8 @@ ] "TARGET_HAVE_MVE" "vqshl.%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ 
-1600,7 +1691,8 @@ ] "TARGET_HAVE_MVE" "vqshl.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_r_")) + (set_attr "type" "mve_move") ]) ;; @@ -1615,7 +1707,8 @@ ] "TARGET_HAVE_MVE" "vqshl.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1630,7 +1723,8 @@ ] "TARGET_HAVE_MVE" "vqshlu.s%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshluq_n_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1645,7 +1739,8 @@ ] "TARGET_HAVE_MVE" "vqsub.%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1660,7 +1755,8 @@ ] "TARGET_HAVE_MVE" "vqsub.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1675,7 +1771,8 @@ ] "TARGET_HAVE_MVE" "vrhadd.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrhaddq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1690,7 +1787,8 @@ ] "TARGET_HAVE_MVE" "vrmulh.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmulhq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1705,7 +1803,8 @@ ] "TARGET_HAVE_MVE" "vrshl.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1720,7 +1819,8 @@ ] "TARGET_HAVE_MVE" "vrshl.%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_")) + (set_attr "type" "mve_move") ]) ;; @@ -1735,7 +1835,8 @@ ] "TARGET_HAVE_MVE" "vrshr.%#\t%q0, %q1, %2" - [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1750,7 +1851,8 @@ ] "TARGET_HAVE_MVE" "vshl.%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1765,7 +1867,8 @@ ] "TARGET_HAVE_MVE" "vshl.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_r_")) + (set_attr "type" "mve_move") ]) ;; @@ -1780,7 +1883,8 @@ ] "TARGET_HAVE_MVE" "vsub.i%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1795,7 +1899,8 @@ ] "TARGET_HAVE_MVE" "vsub.i%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_")) + (set_attr "type" "mve_move") ]) (define_insn "mve_vsubq" @@ -1806,7 +1911,8 @@ ] "TARGET_HAVE_MVE" "vsub.i%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -1821,7 +1927,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vabd.f%# %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1836,7 +1943,8 @@ ] "TARGET_HAVE_MVE" "vaddlva.32\t%Q0, %R0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddlvaq_v4si")) + (set_attr "type" "mve_move") ]) ;; @@ -1851,7 +1959,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vadd.f%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1865,7 +1974,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vand %q0, %q1, %q2" - 
[(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1879,7 +1989,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vbic %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1894,7 +2005,8 @@ ] "TARGET_HAVE_MVE" "vbic.i%# %q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -1909,7 +2021,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcadd.f%# %q0, %q1, %q2, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq")) + (set_attr "type" "mve_move") ]) ;; @@ -1923,7 +2036,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcmp.f%# , %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1938,7 +2052,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcmp.f%# , %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpq_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -1953,7 +2068,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcmul.f%# %q0, %q1, %q2, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq")) + (set_attr "type" "mve_move") ]) ;; @@ -1968,7 +2084,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vctpt. 
%1" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vctpqhi")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -1983,7 +2100,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtb.f16.f32 %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf")) + (set_attr "type" "mve_move") ]) ;; @@ -1998,7 +2116,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vcvtt.f16.f32 %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf")) + (set_attr "type" "mve_move") ]) ;; @@ -2012,7 +2131,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "veor %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2027,7 +2147,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vmaxnma.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmaq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2042,7 +2163,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vmaxnmav.f%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmavq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2056,7 +2178,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vmaxnm.f%# %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2071,7 +2194,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vmaxnmv.f%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmvq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2086,7 +2210,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vminnma.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmaq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2101,7 +2226,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vminnmav.f%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmavq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2115,7 +2241,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vminnm.f%# %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2130,7 +2257,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vminnmv.f%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmvq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2145,7 +2273,8 @@ ] "TARGET_HAVE_MVE" "vmlaldav.%# %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavq_")) + (set_attr "type" "mve_move") ]) ;; @@ -2160,7 +2289,8 @@ ] "TARGET_HAVE_MVE" "vmlaldavx.s%# %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2175,7 +2305,8 @@ ] "TARGET_HAVE_MVE" "vmlsldav.s%# %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2190,7 +2321,8 @@ ] "TARGET_HAVE_MVE" "vmlsldavx.s%# %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2205,7 +2337,8 @@ ] "TARGET_HAVE_MVE" "vmovnb.i%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovnbq_")) + (set_attr "type" "mve_move") ]) ;; @@ -2220,7 +2353,8 @@ ] "TARGET_HAVE_MVE" "vmovnt.i%# %q0, %q2" - 
[(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovntq_")) + (set_attr "type" "mve_move") ]) ;; @@ -2234,7 +2368,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vmul.f%# %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2249,7 +2384,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vmul.f%# %q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2263,7 +2399,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vorn %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2277,7 +2414,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vorr %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2292,7 +2430,8 @@ ] "TARGET_HAVE_MVE" "vorr.i%# %q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -2307,7 +2446,8 @@ ] "TARGET_HAVE_MVE" "vqdmullb.s%# %q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_n_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2322,7 +2462,8 @@ ] "TARGET_HAVE_MVE" "vqdmullb.s%# %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2337,7 +2478,8 @@ ] "TARGET_HAVE_MVE" "vqdmullt.s%# %q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_n_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2352,7 +2494,8 @@ ] "TARGET_HAVE_MVE" 
"vqdmullt.s%# %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2367,7 +2510,8 @@ ] "TARGET_HAVE_MVE" "vqmovnb.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovnbq_")) + (set_attr "type" "mve_move") ]) ;; @@ -2382,7 +2526,8 @@ ] "TARGET_HAVE_MVE" "vqmovnt.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovntq_")) + (set_attr "type" "mve_move") ]) ;; @@ -2397,7 +2542,8 @@ ] "TARGET_HAVE_MVE" "vqmovunb.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovunbq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2412,7 +2558,8 @@ ] "TARGET_HAVE_MVE" "vqmovunt.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovuntq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -2427,7 +2574,8 @@ ] "TARGET_HAVE_MVE" "vrmlaldavhx.s32 %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhxq_sv4si")) + (set_attr "type" "mve_move") ]) ;; @@ -2442,7 +2590,8 @@ ] "TARGET_HAVE_MVE" "vrmlsldavh.s32\t%Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhq_sv4si")) + (set_attr "type" "mve_move") ]) ;; @@ -2457,7 +2606,8 @@ ] "TARGET_HAVE_MVE" "vrmlsldavhx.s32\t%Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhxq_sv4si")) + (set_attr "type" "mve_move") ]) ;; @@ -2472,7 +2622,8 @@ ] "TARGET_HAVE_MVE" "vshllb.%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshllbq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -2487,7 +2638,8 @@ ] "TARGET_HAVE_MVE" 
"vshllt.%#\t%q0, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlltq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -2501,7 +2653,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vsub.f%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -2516,7 +2669,8 @@ ] "TARGET_HAVE_MVE" "vmullt.p%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p")) + (set_attr "type" "mve_move") ]) ;; @@ -2531,7 +2685,8 @@ ] "TARGET_HAVE_MVE" "vmullb.p%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p")) + (set_attr "type" "mve_move") ]) ;; @@ -2546,7 +2701,8 @@ ] "TARGET_HAVE_MVE" "vrmlaldavh.32\t%Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhq_v4si")) + (set_attr "type" "mve_move") ]) ;; @@ -2562,7 +2718,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vbict.i%# %q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vcmpeqq_m_f]) @@ -2577,7 +2734,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# eq, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vcvtaq_m_u, vcvtaq_m_s]) @@ -2592,7 +2750,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtat.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtaq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vcvtq_m_to_f_s, vcvtq_m_to_f_u]) @@ -2607,7 +2766,8 @@ ] "TARGET_HAVE_MVE && 
TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.f%#.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_to_f_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vqrshrnbq_n_u, vqrshrnbq_n_s]) @@ -2622,7 +2782,8 @@ ] "TARGET_HAVE_MVE" "vqrshrnb.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrnbq_n_")) + (set_attr "type" "mve_move") ]) ;; ;; [vqrshrunbq_n_s]) @@ -2637,7 +2798,8 @@ ] "TARGET_HAVE_MVE" "vqrshrunb.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrunbq_n_s")) + (set_attr "type" "mve_move") ]) ;; ;; [vrmlaldavhaq_s vrmlaldavhaq_u]) @@ -2652,7 +2814,8 @@ ] "TARGET_HAVE_MVE" "vrmlaldavha.32\t%Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_v4si")) + (set_attr "type" "mve_move") ]) ;; @@ -2668,7 +2831,8 @@ ] "TARGET_HAVE_MVE" "vabav.%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabavq_")) + (set_attr "type" "mve_move") ]) ;; @@ -2729,7 +2893,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vabst.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2745,7 +2910,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vaddvat.%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddvaq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2761,7 +2927,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vclst.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclsq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2777,7 +2944,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vclzt.i%# %q0, %q2" - [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vclzq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2793,7 +2961,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.u%# cs, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpcsq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2809,7 +2978,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.u%# cs, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpcsq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2825,7 +2995,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.i%# eq, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2841,7 +3012,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.i%# eq, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2857,7 +3029,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.s%# ge, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2873,7 +3046,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.s%# ge, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2889,7 +3063,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.s%# gt, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2905,7 +3080,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.s%# gt, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vcmpgtq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2921,7 +3097,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.u%# hi, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmphiq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2937,7 +3114,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.u%# hi, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmphiq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2953,7 +3131,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.s%# le, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2969,7 +3148,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.s%# le, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -2985,7 +3165,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.s%# lt, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3001,7 +3182,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.s%# lt, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3017,7 +3199,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.i%# ne, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3033,7 +3216,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcmpt.i%# ne, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_")) + (set_attr "type" "mve_move") (set_attr 
"length""8")]) ;; @@ -3049,8 +3233,9 @@ ] "TARGET_HAVE_MVE" "vpst\;vdupt.%#\t%q0, %2" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vmaxaq_m_s]) @@ -3065,7 +3250,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmaxat.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxaq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3081,7 +3267,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmaxavt.s%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxavq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3097,7 +3284,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmaxvt.%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxvq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3113,7 +3301,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vminat.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminaq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3129,7 +3318,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vminavt.s%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminavq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3145,7 +3335,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vminvt.%#\t%0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminvq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3161,7 +3352,8 @@ ] "TARGET_HAVE_MVE" "vmladava.%# %0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaq_")) + (set_attr "type" "mve_move") ]) ;; @@ -3177,7 +3369,8 @@ ] "TARGET_HAVE_MVE" 
"vpst\;vmladavt.%#\t%0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3193,7 +3386,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmladavxt.s%#\t%0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavxq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3209,7 +3403,8 @@ ] "TARGET_HAVE_MVE" "vmla.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -3225,7 +3420,8 @@ ] "TARGET_HAVE_MVE" "vmlas.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlasq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -3241,7 +3437,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmlsdavt.s%# %0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3257,7 +3454,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmlsdavxt.s%# %0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavxq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3273,7 +3471,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmvnt %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3289,7 +3488,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vnegt.s%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3305,7 +3505,8 @@ ] "TARGET_HAVE_MVE" "vpsel %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vpselq_")) + (set_attr "type" "mve_move") ]) ;; @@ -3321,7 +3522,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqabst.s%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqabsq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3337,7 +3539,8 @@ ] "TARGET_HAVE_MVE" "vqdmlah.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlahq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -3353,7 +3556,8 @@ ] "TARGET_HAVE_MVE" "vqdmlash.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlashq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -3369,7 +3573,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqnegt.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqnegq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3385,7 +3590,8 @@ ] "TARGET_HAVE_MVE" "vqrdmladh.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3401,7 +3607,8 @@ ] "TARGET_HAVE_MVE" "vqrdmladhx.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3417,7 +3624,8 @@ ] "TARGET_HAVE_MVE" "vqrdmlah.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlahq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -3433,7 +3641,8 @@ ] "TARGET_HAVE_MVE" "vqrdmlash.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlashq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -3449,7 +3658,8 @@ ] "TARGET_HAVE_MVE" "vqrdmlsdh.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3465,7 +3675,8 @@ ] "TARGET_HAVE_MVE" "vqrdmlsdhx.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3481,7 +3692,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqrshlt.%# %q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3497,7 +3709,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqshlt.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_r_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3513,7 +3726,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrev64t.%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3529,7 +3743,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrshlt.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3545,7 +3760,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vshlt.%#\t%q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_r_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3561,7 +3777,8 @@ ] "TARGET_HAVE_MVE" "vsli.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsliq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -3577,7 +3794,8 @@ ] "TARGET_HAVE_MVE" "vsri.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsriq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -3593,7 +3811,8 @@ ] "TARGET_HAVE_MVE" 
"vqdmlsdhx.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3609,7 +3828,8 @@ ] "TARGET_HAVE_MVE" "vqdmlsdh.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3625,7 +3845,8 @@ ] "TARGET_HAVE_MVE" "vqdmladhx.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3641,7 +3862,8 @@ ] "TARGET_HAVE_MVE" "vqdmladh.s%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3657,7 +3879,8 @@ ] "TARGET_HAVE_MVE" "vmlsdavax.s%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3673,7 +3896,8 @@ ] "TARGET_HAVE_MVE" "vmlsdava.s%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -3689,7 +3913,8 @@ ] "TARGET_HAVE_MVE" "vmladavax.s%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaxq_s")) + (set_attr "type" "mve_move") ]) ;; ;; [vabsq_m_f]) @@ -3704,7 +3929,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vabst.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabsq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3720,8 +3946,10 @@ ] "TARGET_HAVE_MVE" "vpst\;vaddlvat.32\t%Q0, %R0, %q2" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vaddlvaq_v4si")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) + ;; ;; [vcmlaq, vcmlaq_rot90, vcmlaq_rot180, vcmlaq_rot270]) ;; @@ -3738,7 +3966,8 @@ "@ vcmul.f%# %q0, %q2, %q3, # vcmla.f%# %q0, %q2, %q3, #" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq")) + (set_attr "type" "mve_move") ]) ;; @@ -3754,7 +3983,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# eq, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpeqq_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3770,7 +4000,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# ge, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3786,7 +4017,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# ge, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgeq_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3802,7 +4034,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# gt, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3818,7 +4051,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# gt, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpgtq_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3834,7 +4068,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# le, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3850,7 
+4085,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# le, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpleq_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3866,7 +4102,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# lt, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3882,7 +4119,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# lt, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpltq_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3898,7 +4136,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# ne, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3914,7 +4153,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcmpt.f%# ne, %q1, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmpneq_n_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3930,7 +4170,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtbt.f16.f32 %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f16_f32v8hf")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3946,7 +4187,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtbt.f32.f16 %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtbq_f32_f16v4sf")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3962,7 +4204,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvttt.f16.f32 %q0, %q2" - [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f16_f32v8hf")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3978,7 +4221,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvttt.f32.f16 %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvttq_f32_f16v4sf")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -3994,8 +4238,9 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vdupt.%#\t%q0, %2" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdupq_n_f")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [vfmaq_f]) @@ -4010,7 +4255,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vfma.f%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -4026,7 +4272,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vfma.f%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_n_f")) + (set_attr "type" "mve_move") ]) ;; @@ -4042,7 +4289,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vfmas.f%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmasq_n_f")) + (set_attr "type" "mve_move") ]) ;; ;; [vfmsq_f]) @@ -4057,7 +4305,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vfms.f%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmsq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -4073,7 +4322,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vmaxnmat.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmaq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vmaxnmavq_p_f]) @@ 
-4088,7 +4338,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vmaxnmavt.f%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmavq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4104,7 +4355,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vmaxnmvt.f%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmvq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vminnmaq_m_f]) @@ -4119,7 +4371,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vminnmat.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmaq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4135,7 +4388,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vminnmavt.f%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmavq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vminnmvq_p_f]) @@ -4150,7 +4404,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vminnmvt.f%# %0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmvq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4166,7 +4421,8 @@ ] "TARGET_HAVE_MVE" "vmlaldava.%#\t%Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaq_")) + (set_attr "type" "mve_move") ]) ;; @@ -4182,7 +4438,8 @@ ] "TARGET_HAVE_MVE" "vmlaldavax.s%#\t%Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -4198,7 +4455,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmlaldavt.%# %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vmlaldavq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4214,7 +4472,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmlaldavxt.s%#\t%Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavxq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vmlsldavaq_s]) @@ -4229,7 +4488,8 @@ ] "TARGET_HAVE_MVE" "vmlsldava.s%# %Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -4245,7 +4505,8 @@ ] "TARGET_HAVE_MVE" "vmlsldavax.s%# %Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaxq_s")) + (set_attr "type" "mve_move") ]) ;; @@ -4261,7 +4522,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmlsldavt.s%# %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4277,7 +4539,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmlsldavxt.s%# %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavxq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vmovlbq_m_u, vmovlbq_m_s]) @@ -4292,7 +4555,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmovlbt.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovlbq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vmovltq_m_u, vmovltq_m_s]) @@ -4307,7 +4571,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmovltt.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovltq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vmovnbq_m_u, vmovnbq_m_s]) @@ -4322,7 +4587,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmovnbt.i%# %q0, %q2" - [(set_attr "type" 
"mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovnbq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4338,7 +4604,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmovntt.i%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmovntq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4354,7 +4621,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmvnt.i%# %q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmvnq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vnegq_m_f]) @@ -4369,7 +4637,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vnegt.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vnegq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4385,7 +4654,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vorrt.i%# %q0, %2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vpselq_f]) @@ -4400,7 +4670,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpsel %q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vpselq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -4416,7 +4687,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqmovnbt.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovnbq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4432,7 +4704,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqmovntt.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovntq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4448,7 +4721,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqmovunbt.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovunbq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4464,7 +4738,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqmovuntt.s%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqmovuntq_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4480,7 +4755,8 @@ ] "TARGET_HAVE_MVE" "vqrshrnt.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrntq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -4496,7 +4772,8 @@ ] "TARGET_HAVE_MVE" "vqrshrunt.s%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshruntq_n_s")) + (set_attr "type" "mve_move") ]) ;; @@ -4512,7 +4789,8 @@ ] "TARGET_HAVE_MVE" "vqshrnb.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrnbq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -4528,7 +4806,8 @@ ] "TARGET_HAVE_MVE" "vqshrnt.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrntq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -4544,7 +4823,8 @@ ] "TARGET_HAVE_MVE" "vqshrunb.s%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrunbq_n_s")) + (set_attr "type" "mve_move") ]) ;; @@ -4560,7 +4840,8 @@ ] "TARGET_HAVE_MVE" "vqshrunt.s%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshruntq_n_s")) + (set_attr "type" "mve_move") ]) ;; @@ -4576,7 +4857,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vrev32t.16 %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_fv8hf")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4592,7 +4874,8 @@ ] "TARGET_HAVE_MVE" 
"vpst\;vrev32t.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev32q_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4608,7 +4891,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vrev64t.%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev64q_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4624,7 +4908,8 @@ ] "TARGET_HAVE_MVE" "vrmlaldavhax.s32 %Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaxq_sv4si")) + (set_attr "type" "mve_move") ]) ;; @@ -4640,7 +4925,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrmlaldavhxt.s32 %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhxq_sv4si")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4656,7 +4942,8 @@ ] "TARGET_HAVE_MVE" "vrmlsldavhax.s32 %Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaxq_sv4si")) + (set_attr "type" "mve_move") ]) ;; @@ -4672,7 +4959,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrmlsldavht.s32 %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhq_sv4si")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4688,7 +4976,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrmlsldavhxt.s32 %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhxq_sv4si")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4704,7 +4993,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vrintat.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndaq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4720,7 
+5010,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vrintmt.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndmq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4736,7 +5027,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vrintnt.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndnq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4752,7 +5044,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vrintpt.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndpq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4768,7 +5061,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vrintxt.f%# %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrndxq_f")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4784,7 +5078,8 @@ ] "TARGET_HAVE_MVE" "vrshrnb.i%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrnbq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -4800,7 +5095,8 @@ ] "TARGET_HAVE_MVE" "vrshrnt.i%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrntq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -4816,7 +5112,8 @@ ] "TARGET_HAVE_MVE" "vshrnb.i%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrnbq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -4832,7 +5129,8 @@ ] "TARGET_HAVE_MVE" "vshrnt.i%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrntq_n_")) + (set_attr "type" "mve_move") ]) ;; @@ -4848,7 +5146,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" 
"vpst\;vcvtmt.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtmq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4864,7 +5163,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtpt.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtpq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4880,7 +5180,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtnt.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtnq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4897,7 +5198,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.%#.f%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_from_f_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4913,7 +5215,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrev16t.8 %q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrev16q_v16qi")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4929,7 +5232,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.%#.f%#\t%q0, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_from_f_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4945,7 +5249,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrmlaldavht.32 %Q0, %R0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhq_v4si")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -4961,7 +5266,8 @@ ] "TARGET_HAVE_MVE" "vrmlsldavha.s32 %Q0, %R0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaq_sv4si")) + (set_attr "type" 
"mve_move") ]) ;; @@ -4978,7 +5284,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vabavt.%#\t%0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabavq_")) + (set_attr "type" "mve_move") ]) ;; @@ -4995,7 +5302,8 @@ ] "TARGET_HAVE_MVE" "vpst\n\tvqshlut.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshluq_n_s")) + (set_attr "type" "mve_move")]) ;; ;; [vshlq_m_s, vshlq_m_u]) @@ -5011,7 +5319,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vshlt.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_")) + (set_attr "type" "mve_move")]) ;; ;; [vsriq_m_n_s, vsriq_m_n_u]) @@ -5027,7 +5336,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vsrit.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsriq_n_")) + (set_attr "type" "mve_move")]) ;; ;; [vsubq_m_u, vsubq_m_s]) @@ -5043,7 +5353,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vsubt.i%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_")) + (set_attr "type" "mve_move")]) ;; ;; [vcvtq_m_n_to_f_u, vcvtq_m_n_to_f_s]) @@ -5059,7 +5370,8 @@ ] "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT" "vpst\;vcvtt.f%#.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcvtq_n_to_f_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; ;; [vabdq_m_s, vabdq_m_u]) @@ -5075,7 +5387,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vabdt.%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5092,7 +5405,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vaddt.i%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_")) + (set_attr "type" 
"mve_move") (set_attr "length""8")]) ;; @@ -5109,7 +5423,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vaddt.i%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5126,7 +5441,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vandt %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5143,7 +5459,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vbict %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5160,7 +5477,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vbrsrt.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5177,7 +5495,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcaddt.i%# %q0, %q2, %q3, #270" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5194,7 +5513,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vcaddt.i%# %q0, %q2, %q3, #90" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5211,7 +5531,8 @@ ] "TARGET_HAVE_MVE" "vpst\;veort %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5228,7 +5549,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vhaddt.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5245,7 +5567,8 @@ 
] "TARGET_HAVE_MVE" "vpst\;vhaddt.%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhaddq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5262,7 +5585,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vhsubt.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5279,7 +5603,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vhsubt.%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhsubq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5296,7 +5621,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmaxt.%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5313,7 +5639,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmint.%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5330,7 +5657,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmladavat.%# %0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5347,7 +5675,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmlat.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5364,7 +5693,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmlast.%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlasq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5381,7 +5711,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmulht.%# %q0, %q2, %q3" - [(set_attr 
"type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulhq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5398,7 +5729,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmullbt.%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_int_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5415,7 +5747,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmulltt.%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_int_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5432,7 +5765,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmult.i%# %q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5449,7 +5783,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vmult.i%# %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5466,7 +5801,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vornt %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5483,7 +5819,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vorrt %q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5500,7 +5837,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqaddt.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqaddq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5517,7 +5855,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqaddt.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_mve_vqaddq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5534,7 +5873,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqdmlaht.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlahq_n_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5551,7 +5891,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqdmlasht.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlashq_n_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5568,7 +5909,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqrdmlaht.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlahq_n_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5585,7 +5927,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqrdmlasht.s%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlashq_n_s")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5602,7 +5945,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqrshlt.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshlq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5619,7 +5963,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqshlt.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5636,7 +5981,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqshlt.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshlq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5653,7 +5999,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqsubt.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vqsubq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5670,7 +6017,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vqsubt.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqsubq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5687,7 +6035,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrhaddt.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrhaddq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5704,7 +6053,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrmulht.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmulhq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5721,7 +6071,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrshlt.%#\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshlq_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5738,7 +6089,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vrshrt.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5755,7 +6107,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vshlt.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5772,7 +6125,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vshrt.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrq_n_")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -5789,7 +6143,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vslit.%#\t%q0, %q2, %3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsliq_n_")) + (set_attr "type" "mve_move") 
   (set_attr "length""8")])
 
 ;;
@@ -5806,7 +6161,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vsubt.i%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5823,7 +6179,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vhcaddt.s%#\t%q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot270_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5840,7 +6197,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vhcaddt.s%#\t%q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vhcaddq_rot90_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5857,7 +6215,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmladavaxt.s%#\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmladavaxq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5874,7 +6233,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsdavat.s%#\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5891,7 +6251,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsdavaxt.s%#\t%0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsdavaxq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5908,7 +6269,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmladht.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5925,7 +6287,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmladhxt.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmladhxq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5942,7 +6305,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmlsdht.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5959,7 +6323,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmlsdhxt.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmlsdhxq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5976,7 +6341,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmulht.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_n_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -5993,7 +6359,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmulht.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulhq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6010,7 +6377,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmladht.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6027,7 +6395,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmladhxt.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmladhxq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6044,7 +6413,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmlsdht.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6061,7 +6431,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmlsdhxt.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmlsdhxq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6078,7 +6449,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmulht.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_n_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6095,7 +6467,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrdmulht.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrdmulhq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6112,7 +6485,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlaldavat.%# %Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaq_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6129,8 +6503,9 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlaldavaxt.%#\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
-   (set_attr "length""8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlaldavaxq_"))
+   (set_attr "type" "mve_move")
+   (set_attr "length""8")])
 
 ;;
 ;; [vqrshrnbq_m_n_u, vqrshrnbq_m_n_s])
@@ -6146,7 +6521,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshrnbt.%# %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrnbq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6163,7 +6539,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshrntt.%# %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrntq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6180,7 +6557,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\n\tvqshrnbt.%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrnbq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6197,7 +6575,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshrntt.%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrntq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6214,7 +6593,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavhat.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_sv4si"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6231,7 +6611,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vrshrnbt.i%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrnbq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6248,7 +6629,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vrshrntt.i%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrshrntq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6265,7 +6647,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vshllbt.%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshllbq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6282,7 +6665,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vshlltt.%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlltq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6299,7 +6683,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vshrnbt.i%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrnbq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6316,7 +6701,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vshrntt.i%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshrntq_n_"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6333,7 +6719,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsldavat.s%#\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6350,7 +6737,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmlsldavaxt.s%#\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmlsldavaxq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6367,7 +6755,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmullbt.p%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmullbq_poly_p"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6384,7 +6773,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vmulltt.p%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulltq_poly_p"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6401,7 +6791,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmullbt.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_n_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6418,7 +6809,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmullbt.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmullbq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6435,7 +6827,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmulltt.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_n_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6452,7 +6845,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqdmulltt.s%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqdmulltq_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6469,7 +6863,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshrunbt.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshrunbq_n_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6486,7 +6881,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqrshruntt.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqrshruntq_n_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6503,7 +6899,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshrunbt.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshrunbq_n_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6520,7 +6917,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vqshruntt.s%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vqshruntq_n_s"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6537,7 +6935,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavhat.u32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaq_uv4si"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6554,7 +6953,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlaldavhaxt.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlaldavhaxq_sv4si"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6571,7 +6971,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlsldavhat.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaq_sv4si"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6588,7 +6989,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vpst\;vrmlsldavhaxt.s32\t%Q0, %R0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vrmlsldavhaxq_sv4si"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
 ;; [vabdq_m_f])
@@ -6604,7 +7006,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vabdt.f%# %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vabdq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6621,7 +7024,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vaddt.f%# %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6638,7 +7042,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vaddt.f%# %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_n_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6655,7 +7060,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vandt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vandq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6672,7 +7078,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vbict %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbicq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6689,7 +7096,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vbrsrt.%# %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vbrsrq_n_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6706,7 +7114,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcaddt.f%# %q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot270"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6723,7 +7132,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcaddt.f%# %q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcaddq_rot90"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6740,7 +7150,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%# %q0, %q2, %q3, #0"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6757,7 +7168,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%# %q0, %q2, %q3, #180"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot180"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6774,7 +7186,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%# %q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot270"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6791,7 +7204,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmlat.f%# %q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmlaq_rot90"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6808,7 +7222,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%# %q0, %q2, %q3, #0"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6825,7 +7240,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%# %q0, %q2, %q3, #180"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot180"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6842,7 +7258,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%# %q0, %q2, %q3, #270"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot270"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6859,7 +7276,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vcmult.f%# %q0, %q2, %q3, #90"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vcmulq_rot90"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6876,7 +7294,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;veort %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_veorq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6893,7 +7312,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vfmat.f%# %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6910,7 +7330,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vfmat.f%# %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmaq_n_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6927,7 +7348,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vfmast.f%# %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmasq_n_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6944,7 +7366,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vfmst.f%# %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vfmsq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6961,7 +7384,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmaxnmt.f%# %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmaxnmq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6978,7 +7402,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vminnmt.f%# %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vminnmq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -6995,7 +7420,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmult.f%# %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -7012,7 +7438,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vmult.f%# %q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vmulq_n_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -7029,7 +7456,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vornt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vornq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -7046,7 +7474,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vorrt %q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vorrq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -7063,7 +7492,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vsubt.f%#\t%q0, %q2, %q3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -7080,7 +7510,8 @@
 ]
   "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
   "vpst\;vsubt.f%#\t%q0, %q2, %3"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsubq_n_f"))
+   (set_attr "type" "mve_move")
   (set_attr "length""8")])
 
 ;;
@@ -7100,7 +7531,8 @@
   output_asm_insn("vstrb.\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_s vstrbq_scatter_offset_u]
@@ -7128,7 +7560,8 @@
 	VSTRBSOQ))]
   "TARGET_HAVE_MVE"
  "vstrb.\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset__insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_s vstrwq_scatter_base_u]
@@ -7150,7 +7583,8 @@
   output_asm_insn("vstrw.u32\t%q2, [%q0, %1]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_v4si"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_gather_offset_s vldrbq_gather_offset_u]
@@ -7173,7 +7607,8 @@
   output_asm_insn ("vldrb.\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrbq_s vldrbq_u]
@@ -7195,7 +7630,8 @@
   output_asm_insn ("vldrb.\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_s vldrwq_gather_base_u]
@@ -7215,7 +7651,8 @@
   output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_v4si"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrbq_scatter_offset_p_s vstrbq_scatter_offset_p_u]
@@ -7247,7 +7684,8 @@
 	VSTRBSOQ))]
   "TARGET_HAVE_MVE"
  "vpst\;vstrbt.\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_scatter_offset__insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_p_s vstrwq_scatter_base_p_u]
@@ -7270,7 +7708,8 @@
   output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_v4si"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrbq_p_s vstrbq_p_u]
@@ -7290,7 +7729,8 @@
   output_asm_insn ("vpst\;vstrbt.\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrbq_"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_gather_offset_z_s vldrbq_gather_offset_z_u]
@@ -7315,7 +7755,8 @@
   output_asm_insn ("vpst\n\tvldrbt.\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_gather_offset_"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrbq_z_s vldrbq_z_u]
@@ -7338,7 +7779,8 @@
   output_asm_insn ("vpst\;vldrbt.\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrbq_"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_z_s vldrwq_gather_base_z_u]
@@ -7359,7 +7801,8 @@
   output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_v4si"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_f]
@@ -7378,7 +7821,8 @@
   output_asm_insn ("vldrh.16\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_s vldrhq_gather_offset_u]
@@ -7401,7 +7845,8 @@
   output_asm_insn ("vldrh.\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_s vldrhq_gather_offset_z_u]
@@ -7426,7 +7871,8 @@
   output_asm_insn ("vpst\n\tvldrht.\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_s vldrhq_gather_shifted_offset_u]
@@ -7449,7 +7895,8 @@
   output_asm_insn ("vldrh.\t%q0, [%m1, %q2, uxtw #1]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_s vldrhq_gather_shited_offset_z_u]
@@ -7474,7 +7921,8 @@
   output_asm_insn ("vpst\n\tvldrht.\t%q0, [%m1, %q2, uxtw #1]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_s, vldrhq_u]
@@ -7496,7 +7944,8 @@
   output_asm_insn ("vldrh.\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_z_f]
@@ -7516,7 +7965,8 @@
   output_asm_insn ("vpst\;vldrht.16\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_fv8hf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_z_s vldrhq_z_u]
@@ -7539,7 +7989,8 @@
   output_asm_insn ("vpst\;vldrht.\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_f]
@@ -7558,7 +8009,8 @@
   output_asm_insn ("vldrw.32\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_s vldrwq_u]
@@ -7577,7 +8029,8 @@
   output_asm_insn ("vldrw.32\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_v4si"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_z_f]
@@ -7597,7 +8050,8 @@
   output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_fv4sf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_z_s vldrwq_z_u]
@@ -7617,7 +8071,8 @@
   output_asm_insn ("vpst\;vldrwt.32\t%q0, %E1",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_v4si"))
+   (set_attr "length" "8")])
 
 (define_expand "mve_vld1q_f"
   [(match_operand:MVE_0 0 "s_register_operand")
@@ -7657,7 +8112,8 @@
   output_asm_insn ("vldrd.64\t%q0, [%q1, %2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_v2di"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_base_z_s vldrdq_gather_base_z_u]
@@ -7678,7 +8134,8 @@
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%q1, %2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_v2di"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_offset_s vldrdq_gather_offset_u]
@@ -7698,7 +8155,8 @@
   output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_v2di"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_offset_z_s vldrdq_gather_offset_z_u]
@@ -7719,7 +8177,8 @@
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_offset_v2di"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_s vldrdq_gather_shifted_offset_u]
@@ -7739,7 +8198,8 @@
   output_asm_insn ("vldrd.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_v2di"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrdq_gather_shifted_offset_z_s vldrdq_gather_shifted_offset_z_u]
@@ -7760,7 +8220,8 @@
   output_asm_insn ("vpst\n\tvldrdt.u64\t%q0, [%m1, %q2, uxtw #3]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_shifted_offset_v2di"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_offset_f]
@@ -7780,7 +8241,8 @@
   output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_offset_z_f]
@@ -7802,7 +8264,8 @@
   output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_offset_fv8hf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_f]
@@ -7822,7 +8285,8 @@
   output_asm_insn ("vldrh.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrhq_gather_shifted_offset_z_f]
@@ -7844,7 +8308,8 @@
   output_asm_insn ("vpst\n\tvldrht.f16\t%q0, [%m1, %q2, uxtw #1]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrhq_gather_shifted_offset_fv8hf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_base_f]
@@ -7864,7 +8329,8 @@
   output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_base_z_f]
@@ -7885,7 +8351,8 @@
   output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%q1, %2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_fv4sf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_f]
@@ -7905,7 +8372,8 @@
   output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_s vldrwq_gather_offset_u]
@@ -7925,7 +8393,8 @@
   output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_v4si"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_offset_z_f]
@@ -7947,7 +8416,8 @@
   output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_fv4sf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_offset_z_s vldrwq_gather_offset_z_u]
@@ -7969,7 +8439,8 @@
   output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_offset_v4si"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_f]
@@ -7989,7 +8460,8 @@
   output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_s vldrwq_gather_shifted_offset_u]
@@ -8009,7 +8481,8 @@
   output_asm_insn ("vldrw.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_v4si"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_f]
@@ -8031,7 +8504,8 @@
   output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_fv4sf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vldrwq_gather_shifted_offset_z_s vldrwq_gather_shifted_offset_z_u]
@@ -8053,7 +8527,8 @@
   output_asm_insn ("vpst\n\tvldrwt.u32\t%q0, [%m1, %q2, uxtw #2]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_shifted_offset_v4si"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_f]
@@ -8072,7 +8547,8 @@
   output_asm_insn ("vstrh.16\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_p_f]
@@ -8092,7 +8568,8 @@
   output_asm_insn ("vpst\;vstrht.16\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_fv8hf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_p_s vstrhq_p_u]
@@ -8112,7 +8589,8 @@
   output_asm_insn ("vpst\;vstrht.\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_s vstrhq_scatter_offset_p_u]
@@ -8144,7 +8622,8 @@
 	VSTRHSOQ))]
  "TARGET_HAVE_MVE"
  "vpst\;vstrht.\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset__insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_offset_s vstrhq_scatter_offset_u]
@@ -8172,7 +8651,8 @@
 	VSTRHSOQ))]
  "TARGET_HAVE_MVE"
  "vstrh.\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset__insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_s vstrhq_scatter_shifted_offset_p_u]
@@ -8204,7 +8684,8 @@
 	VSTRHSSOQ))]
  "TARGET_HAVE_MVE"
  "vpst\;vstrht.\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset__insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_s vstrhq_scatter_shifted_offset_u]
@@ -8233,7 +8714,8 @@
 	VSTRHSSOQ))]
  "TARGET_HAVE_MVE"
  "vstrh.\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset__insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_s, vstrhq_u]
@@ -8252,7 +8734,8 @@
   output_asm_insn ("vstrh.\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_f]
@@ -8271,7 +8754,8 @@
   output_asm_insn ("vstrw.32\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_p_f]
@@ -8291,7 +8775,8 @@
   output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_fv4sf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_p_s vstrwq_p_u]
@@ -8311,7 +8796,8 @@
   output_asm_insn ("vpst\;vstrwt.32\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_v4si"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_s vstrwq_u]
@@ -8330,7 +8816,8 @@
   output_asm_insn ("vstrw.32\t%q1, %E0",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_v4si"))
+   (set_attr "length" "4")])
 
 (define_expand "mve_vst1q_f"
   [(match_operand: 0 "mve_memory_operand")
@@ -8373,7 +8860,8 @@
   output_asm_insn ("vpst\;\tvstrdt.u64\t%q2, [%q0, %1]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_v2di"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_base_s vstrdq_scatter_base_u]
@@ -8395,7 +8883,8 @@
   output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_v2di"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_offset_p_s vstrdq_scatter_offset_p_u]
@@ -8426,7 +8915,8 @@
 	VSTRDSOQ))]
  "TARGET_HAVE_MVE"
  "vpst\;vstrdt.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_v2di_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_offset_s vstrdq_scatter_offset_u]
@@ -8454,7 +8944,8 @@
 	VSTRDSOQ))]
  "TARGET_HAVE_MVE"
 "vstrd.64\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_offset_v2di_insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_p_s vstrdq_scatter_shifted_offset_p_u]
@@ -8486,7 +8977,8 @@
 	VSTRDSSOQ))]
  "TARGET_HAVE_MVE"
  "vpst\;vstrdt.64\t%q2, [%0, %q1, UXTW #3]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_v2di_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrdq_scatter_shifted_offset_s vstrdq_scatter_shifted_offset_u]
@@ -8515,7 +9007,8 @@
 	VSTRDSSOQ))]
  "TARGET_HAVE_MVE"
  "vstrd.64\t%q2, [%0, %q1, UXTW #3]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_shifted_offset_v2di_insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_f]
@@ -8543,7 +9036,8 @@
 	VSTRHQSO_F))]
  "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
  "vstrh.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_offset_p_f]
@@ -8574,7 +9068,8 @@
 	VSTRHQSO_F))]
  "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
  "vpst\;vstrht.16\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_offset_fv8hf_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_f]
@@ -8602,7 +9097,8 @@
 	VSTRHQSSO_F))]
  "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
  "vstrh.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrhq_scatter_shifted_offset_p_f]
@@ -8634,7 +9130,8 @@
 	VSTRHQSSO_F))]
  "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
  "vpst\;vstrht.16\t%q2, [%0, %q1, uxtw #1]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrhq_scatter_shifted_offset_fv8hf_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_base_f]
@@ -8656,7 +9153,8 @@
   output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]",ops);
   return "";
 }
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_base_p_f]
@@ -8679,7 +9177,8 @@
   output_asm_insn ("vpst\n\tvstrwt.u32\t%q2, [%q0, %1]",ops);
   return "";
 }
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_fv4sf"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_f]
@@ -8707,7 +9206,8 @@
 	VSTRWQSO_F))]
  "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
  "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_offset_p_f]
@@ -8738,7 +9238,8 @@
 	VSTRWQSO_F))]
  "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
  "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_fv4sf_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -8769,7 +9270,8 @@
 	VSTRWSOQ))]
  "TARGET_HAVE_MVE"
  "vpst\;vstrwt.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_v4si_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_offset_s vstrwq_scatter_offset_u]
@@ -8797,7 +9299,8 @@
 	VSTRWSOQ))]
  "TARGET_HAVE_MVE"
  "vstrw.32\t%q2, [%0, %q1]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_offset_v4si_insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_f]
@@ -8825,7 +9328,8 @@
 	VSTRWQSSO_F))]
  "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
  "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_f]
@@ -8857,7 +9361,8 @@
 	VSTRWQSSO_F))]
  "TARGET_HAVE_MVE && TARGET_HAVE_MVE_FLOAT"
  "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_fv4sf_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_p_s vstrwq_scatter_shifted_offset_p_u]
@@ -8889,7 +9394,8 @@
 	VSTRWSSOQ))]
  "TARGET_HAVE_MVE"
  "vpst\;vstrwt.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "8")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_v4si_insn"))
+   (set_attr "length" "8")])
 
 ;;
 ;; [vstrwq_scatter_shifted_offset_s vstrwq_scatter_shifted_offset_u]
@@ -8918,7 +9424,8 @@
 	VSTRWSSOQ))]
  "TARGET_HAVE_MVE"
  "vstrw.32\t%q2, [%0, %q1, uxtw #2]"
-  [(set_attr "length" "4")])
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_shifted_offset_v4si_insn"))
+   (set_attr "length" "4")])
 
 ;;
 ;; [vaddq_s, vaddq_u])
@@ -8931,7 +9438,8 @@
 ]
   "TARGET_HAVE_MVE"
   "vadd.i%#\t%q0, %q1, %q2"
-  [(set_attr "type" "mve_move")
+  [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq"))
+   (set_attr "type" "mve_move")
 ])
 
 ;;
@@ -8945,7 +9453,8 @@
 ]
   "TARGET_HAVE_MVE
&& TARGET_HAVE_MVE_FLOAT" "vadd.f%#\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vaddq_f")) + (set_attr "type" "mve_move") ]) ;; @@ -9014,7 +9523,8 @@ (match_operand:SI 6 "immediate_operand" "i")))] "TARGET_HAVE_MVE" "vpst\;\tvidupt.u%#\t%q0, %2, %4" - [(set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vidupq_u_insn")) + (set_attr "length""8")]) ;; ;; [vddupq_n_u]) @@ -9082,7 +9592,8 @@ (match_operand:SI 6 "immediate_operand" "i")))] "TARGET_HAVE_MVE" "vpst\;vddupt.u%#\t%q0, %2, %4" - [(set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vddupq_u_insn")) + (set_attr "length""8")]) ;; ;; [vdwdupq_n_u]) @@ -9198,8 +9709,9 @@ ] "TARGET_HAVE_MVE" "vpst\;vdwdupt.u%#\t%q2, %3, %R4, %5" - [(set_attr "type" "mve_move") - (set_attr "length""8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vdwdupq_wb_u_insn")) + (set_attr "type" "mve_move") + (set_attr "length""8")]) ;; ;; [viwdupq_n_u]) @@ -9315,7 +9827,8 @@ ] "TARGET_HAVE_MVE" "vpst\;\tviwdupt.u%#\t%q2, %3, %R4, %5" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_viwdupq_wb_u_insn")) + (set_attr "type" "mve_move") (set_attr "length""8")]) ;; @@ -9341,7 +9854,8 @@ output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_v4si")) + (set_attr "length" "4")]) ;; ;; [vstrwq_scatter_base_wb_p_s vstrwq_scatter_base_wb_p_u] @@ -9367,7 +9881,8 @@ output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_v4si")) + (set_attr "length" "8")]) ;; ;; [vstrwq_scatter_base_wb_f] @@ -9392,7 +9907,8 @@ output_asm_insn ("vstrw.u32\t%q2, [%q0, %1]!",ops); return ""; } - 
[(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf")) + (set_attr "length" "4")]) ;; ;; [vstrwq_scatter_base_wb_p_f] @@ -9418,7 +9934,8 @@ output_asm_insn ("vpst\;\tvstrwt.u32\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrwq_scatter_base_wb_fv4sf")) + (set_attr "length" "8")]) ;; ;; [vstrdq_scatter_base_wb_s vstrdq_scatter_base_wb_u] @@ -9443,7 +9960,8 @@ output_asm_insn ("vstrd.u64\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_v2di")) + (set_attr "length" "4")]) ;; ;; [vstrdq_scatter_base_wb_p_s vstrdq_scatter_base_wb_p_u] @@ -9469,7 +9987,8 @@ output_asm_insn ("vpst;vstrdt.u64\t%q2, [%q0, %1]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vstrdq_scatter_base_wb_v2di")) + (set_attr "length" "8")]) (define_expand "mve_vldrwq_gather_base_wb_v4si" [(match_operand:V4SI 0 "s_register_operand") @@ -9521,7 +10040,8 @@ output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_v4si_insn")) + (set_attr "length" "4")]) (define_expand "mve_vldrwq_gather_base_wb_z_v4si" [(match_operand:V4SI 0 "s_register_operand") @@ -9577,7 +10097,8 @@ output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_v4si_insn")) + (set_attr "length" "8")]) (define_expand "mve_vldrwq_gather_base_wb_fv4sf" [(match_operand:V4SI 0 "s_register_operand") @@ -9629,7 +10150,8 @@ output_asm_insn ("vldrw.u32\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref 
"CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn")) + (set_attr "length" "4")]) (define_expand "mve_vldrwq_gather_base_wb_z_fv4sf" [(match_operand:V4SI 0 "s_register_operand") @@ -9686,7 +10208,8 @@ output_asm_insn ("vpst\;vldrwt.u32\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrwq_gather_base_wb_fv4sf_insn")) + (set_attr "length" "8")]) (define_expand "mve_vldrdq_gather_base_wb_v2di" [(match_operand:V2DI 0 "s_register_operand") @@ -9739,7 +10262,8 @@ output_asm_insn ("vldrd.64\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "4")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_v2di_insn")) + (set_attr "length" "4")]) (define_expand "mve_vldrdq_gather_base_wb_z_v2di" [(match_operand:V2DI 0 "s_register_operand") @@ -9778,7 +10302,7 @@ (unspec:SI [(reg:SI VFPCC_REGNUM)] UNSPEC_GET_FPSCR_NZCVQC))] "TARGET_HAVE_MVE" "vmrs\\t%0, FPSCR_nzcvqc" - [(set_attr "type" "mve_move")]) + [(set_attr "type" "mve_move")]) (define_insn "set_fpscr_nzcvqc" [(set (reg:SI VFPCC_REGNUM) @@ -9786,7 +10310,7 @@ VUNSPEC_SET_FPSCR_NZCVQC))] "TARGET_HAVE_MVE" "vmsr\\tFPSCR_nzcvqc, %0" - [(set_attr "type" "mve_move")]) + [(set_attr "type" "mve_move")]) ;; ;; [vldrdq_gather_base_wb_z_s vldrdq_gather_base_wb_z_u] @@ -9811,7 +10335,8 @@ output_asm_insn ("vpst\;vldrdt.u64\t%q0, [%q1, %2]!",ops); return ""; } - [(set_attr "length" "8")]) + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vldrdq_gather_base_wb_v2di_insn")) + (set_attr "length" "8")]) ;; ;; [vadciq_m_s, vadciq_m_u]) ;; @@ -9828,7 +10353,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vadcit.i32\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; @@ -9845,7 +10371,8 @@ ] "TARGET_HAVE_MVE" "vadci.i32\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr 
"mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadciq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "4")]) ;; @@ -9864,7 +10391,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vadct.i32\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; @@ -9881,7 +10409,8 @@ ] "TARGET_HAVE_MVE" "vadc.i32\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vadcq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "4") (set_attr "conds" "set")]) @@ -9901,7 +10430,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vsbcit.i32\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; @@ -9918,7 +10448,8 @@ ] "TARGET_HAVE_MVE" "vsbci.i32\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbciq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "4")]) ;; @@ -9937,7 +10468,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vsbct.i32\t%q0, %q2, %q3" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; @@ -9954,7 +10486,8 @@ ] "TARGET_HAVE_MVE" "vsbc.i32\t%q0, %q1, %q2" - [(set_attr "type" "mve_move") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vsbcq_v4si")) + (set_attr "type" "mve_move") (set_attr "length" "4")]) ;; @@ -9983,7 +10516,7 @@ "vst21.\t{%q0, %q1}, %3", ops); return ""; } - [(set_attr "length" "8")]) + [(set_attr "length" "8")]) ;; ;; [vld2q]) @@ -10011,7 +10544,7 @@ "vld21.\t{%q0, %q1}, %3", ops); return ""; } - [(set_attr "length" "8")]) + [(set_attr "length" "8")]) ;; ;; [vld4q]) @@ -10354,7 +10887,8 @@ ] "TARGET_HAVE_MVE" "vpst\;vshlct\t%q0, %1, %4" - [(set_attr "type" "mve_move") 
+ [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlcq_")) + (set_attr "type" "mve_move") (set_attr "length" "8")]) ;; CDE instructions on MVE registers. @@ -10366,7 +10900,8 @@ UNSPEC_VCDE))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx1\\tp%c1, %q0, #%c2" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qv16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx1qav16qi" @@ -10377,7 +10912,8 @@ UNSPEC_VCDEA))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx1a\\tp%c1, %q0, #%c3" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx1qav16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx2qv16qi" @@ -10388,7 +10924,8 @@ UNSPEC_VCDE))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx2\\tp%c1, %q0, %q2, #%c3" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx2qav16qi" @@ -10400,7 +10937,8 @@ UNSPEC_VCDEA))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx2a\\tp%c1, %q0, %q3, #%c4" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qav16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx3qv16qi" @@ -10412,7 +10950,8 @@ UNSPEC_VCDE))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx3\\tp%c1, %q0, %q2, %q3, #%c4" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx3qav16qi" @@ -10425,7 +10964,8 @@ UNSPEC_VCDEA))] "TARGET_CDE && TARGET_HAVE_MVE" "vcx3a\\tp%c1, %q0, %q3, %q4, #%c5" - [(set_attr "type" "coproc")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qav16qi")) + (set_attr "type" "coproc")] ) (define_insn "arm_vcx1q_p_v16qi" @@ -10437,7 +10977,8 @@ CDE_VCX))] "TARGET_CDE && TARGET_HAVE_MVE" "vpst\;vcx1t\\tp%c1, %q0, #%c3" - [(set_attr "type" "coproc") + [(set (attr "mve_unpredicated_insn") 
(symbol_ref "CODE_FOR_arm_vcx1qv16qi")) + (set_attr "type" "coproc") (set_attr "length" "8")] ) @@ -10451,7 +10992,8 @@ CDE_VCX))] "TARGET_CDE && TARGET_HAVE_MVE" "vpst\;vcx2t\\tp%c1, %q0, %q3, #%c4" - [(set_attr "type" "coproc") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx2qv16qi")) + (set_attr "type" "coproc") (set_attr "length" "8")] ) @@ -10466,7 +11008,8 @@ CDE_VCX))] "TARGET_CDE && TARGET_HAVE_MVE" "vpst\;vcx3t\\tp%c1, %q0, %q3, %q4, #%c5" - [(set_attr "type" "coproc") + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_arm_vcx3qv16qi")) + (set_attr "type" "coproc") (set_attr "length" "8")] ) diff --git a/gcc/config/arm/vec-common.md b/gcc/config/arm/vec-common.md index 1fd68f3ac43c64bc980cb59a6bf38e7db2c78be2..9cd2a91b1ed354f101b8c80e3252a63f5b77e73c 100644 --- a/gcc/config/arm/vec-common.md +++ b/gcc/config/arm/vec-common.md @@ -366,7 +366,8 @@ "@ vshl.%#\t%0, %1, %2 * return neon_output_shift_immediate (\"vshl\", 'i', &operands[2], mode, VALID_NEON_QREG_MODE (mode), true);" - [(set_attr "type" "neon_shift_reg, neon_shift_imm")] + [(set (attr "mve_unpredicated_insn") (symbol_ref "CODE_FOR_mve_vshlq_")) + (set_attr "type" "neon_shift_reg, neon_shift_imm")] ) (define_expand "vashl3" diff --git a/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c new file mode 100644 index 0000000000000000000000000000000000000000..acf0836050c19b983feeaf97c3e52e1318bb194d --- /dev/null +++ b/gcc/testsuite/gcc.target/arm/dlstp-compile-asm.c @@ -0,0 +1,149 @@ +/* { dg-do compile { target { arm*-*-* } } } */ +/* { dg-require-effective-target arm_v8_1m_mve_ok } */ +/* { dg-skip-if "avoid conflicting multilib options" { *-*-* } { "-marm" "-mcpu=*" } } */ +/* { dg-options "-march=armv8.1-m.main+fp.dp+mve.fp -mfloat-abi=hard -mfpu=auto -O3" } */ + +#include + +#define IMM 5 + +#define TEST_COMPILE_IN_DLSTP_TERNARY(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED) \ +void test_##NAME##PRED##_##SIGN##BITS 
(TYPE##BITS##_t *a, TYPE##BITS##_t *b, TYPE##BITS##_t *c, int n) \
+{ \
+  while (n > 0) \
+    { \
+      mve_pred16_t p = vctp##BITS##q (n); \
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p); \
+      TYPE##BITS##x##LANES##_t vb = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (b, p); \
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_##SIGN##BITS (va, vb, p); \
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p); \
+      c += LANES; \
+      a += LANES; \
+      b += LANES; \
+      n -= LANES; \
+    } \
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY(BITS, LANES, LDRSTRYTPE, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_TERNARY (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY(NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (8, 16, b, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (16, 8, h, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY (32, 4, w, NAME, PRED)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vmulq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vsubq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vhaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY (vorrq, _x)
+
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_M(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED) \
+void test_##NAME##PRED##_##SIGN##BITS (TYPE##BITS##x##LANES##_t __inactive, TYPE##BITS##_t *a, TYPE##BITS##_t *b, TYPE##BITS##_t *c, int n) \
+{ \
+  while (n > 0) \
+    { \
+      mve_pred16_t p = vctp##BITS##q (n); \
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p); \
+      TYPE##BITS##x##LANES##_t vb = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (b, p); \
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_##SIGN##BITS (__inactive, va, vb, p); \
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p); \
+      c += LANES; \
+      a += LANES; \
+      b += LANES; \
+      n -= LANES; \
+    } \
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M(BITS, LANES, LDRSTRYTPE, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_TERNARY_M (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_TERNARY_M (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M(NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M (8, 16, b, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M (16, 8, h, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M (32, 4, w, NAME, PRED)
+
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vaddq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vmulq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vsubq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vhaddq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M (vorrq, _m)
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_N(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED) \
+void test_##NAME##PRED##_n_##SIGN##BITS (TYPE##BITS##_t *a, TYPE##BITS##_t *c, int n) \
+{ \
+  while (n > 0) \
+    { \
+      mve_pred16_t p = vctp##BITS##q (n); \
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p); \
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_n_##SIGN##BITS (va, IMM, p); \
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p); \
+      c += LANES; \
+      a += LANES; \
+      n -= LANES; \
+    } \
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N(BITS, LANES, LDRSTRYTPE, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_TERNARY_N (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N(NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (8, 16, b, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (16, 8, h, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_N (32, 4, w, NAME, PRED)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vaddq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vmulq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vsubq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vhaddq, _x)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vbrsrq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshlq, _x)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_N (vshrq, _x)
+
+#define TEST_COMPILE_IN_DLSTP_TERNARY_M_N(BITS, LANES, LDRSTRYTPE, TYPE, SIGN, NAME, PRED) \
+void test_##NAME##PRED##_n_##SIGN##BITS (TYPE##BITS##x##LANES##_t __inactive, TYPE##BITS##_t *a, TYPE##BITS##_t *c, int n) \
+{ \
+  while (n > 0) \
+    { \
+      mve_pred16_t p = vctp##BITS##q (n); \
+      TYPE##BITS##x##LANES##_t va = vldr##LDRSTRYTPE##q_z_##SIGN##BITS (a, p); \
+      TYPE##BITS##x##LANES##_t vc = NAME##PRED##_n_##SIGN##BITS (__inactive, va, IMM, p); \
+      vstr##LDRSTRYTPE##q_p_##SIGN##BITS (c, vc, p); \
+      c += LANES; \
+      a += LANES; \
+      n -= LANES; \
+    } \
+}
+
+#define TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M_N(BITS, LANES, LDRSTRYTPE, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_TERNARY_M_N (BITS, LANES, LDRSTRYTPE, int, s, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_TERNARY_M_N (BITS, LANES, LDRSTRYTPE, uint, u, NAME, PRED)
+
+#define TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N(NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M_N (8, 16, b, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M_N (16, 8, h, NAME, PRED) \
+TEST_COMPILE_IN_DLSTP_SIGNED_UNSIGNED_TERNARY_M_N (32, 4, w, NAME, PRED)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vaddq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vmulq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vsubq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vhaddq, _m)
+
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vbrsrq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vshlq, _m)
+TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY_M_N (vshrq, _m)
+
+/* The final number of DLSTPs currently is calculated by the number of
+   `TEST_COMPILE_IN_DLSTP_INTBITS_SIGNED_UNSIGNED_TERNARY.*` macros * 6. */
+/* { dg-final { scan-assembler-times {\tdlstp} 144 } } */
+/* { dg-final { scan-assembler-times {\tletp} 144 } } */
+/* { dg-final { scan-assembler-not "\tvctp\t" } } */
+/* { dg-final { scan-assembler-not "\tvpst\t" } } */
+/* { dg-final { scan-assembler-not "p0" } } */