pax_global_header00006660000000000000000000000064150337065000014511gustar00rootroot0000000000000052 comment=b05629a2e46f919009b9d55dcaeb74fa7ae1d211 hare-update-0.25.2.0/000077500000000000000000000000001503370650000141345ustar00rootroot00000000000000hare-update-0.25.2.0/.gitignore000066400000000000000000000000431503370650000161210ustar00rootroot00000000000000/hare-update /hare-update-genrules hare-update-0.25.2.0/COPYING000066400000000000000000000330041503370650000151670ustar00rootroot00000000000000 EUROPEAN UNION PUBLIC LICENCE v. 1.2 EUPL © the European Union 2007, 2016 This European Union Public Licence (the ‘EUPL’) applies to the Work (as defined below) which is provided under the terms of this Licence. Any use of the Work, other than as authorised under this Licence is prohibited (to the extent such use is covered by a right of the copyright holder of the Work). The Work is provided under the terms of this Licence when the Licensor (as defined below) has placed the following notice immediately following the copyright notice for the Work: Licensed under the EUPL or has expressed by any other means his willingness to license under the EUPL. 1. Definitions In this Licence, the following terms have the following meaning: - ‘The Licence’: this Licence. - ‘The Original Work’: the work or software distributed or communicated by the Licensor under this Licence, available as Source Code and also as Executable Code as the case may be. - ‘Derivative Works’: the works or software that could be created by the Licensee, based upon the Original Work or modifications thereof. This Licence does not define the extent of modification or dependence on the Original Work required in order to classify a work as a Derivative Work; this extent is determined by copyright law applicable in the country mentioned in Article 15. - ‘The Work’: the Original Work or its Derivative Works. 
- ‘The Source Code’: the human-readable form of the Work which is the most convenient for people to study and modify. - ‘The Executable Code’: any code which has generally been compiled and which is meant to be interpreted by a computer as a program. - ‘The Licensor’: the natural or legal person that distributes or communicates the Work under the Licence. - ‘Contributor(s)’: any natural or legal person who modifies the Work under the Licence, or otherwise contributes to the creation of a Derivative Work. - ‘The Licensee’ or ‘You’: any natural or legal person who makes any usage of the Work under the terms of the Licence. - ‘Distribution’ or ‘Communication’: any act of selling, giving, lending, renting, distributing, communicating, transmitting, or otherwise making available, online or offline, copies of the Work or providing access to its essential functionalities at the disposal of any other natural or legal person. 2. Scope of the rights granted by the Licence The Licensor hereby grants You a worldwide, royalty-free, non-exclusive, sublicensable licence to do the following, for the duration of copyright vested in the Original Work: - use the Work in any circumstance and for all usage, - reproduce the Work, - modify the Work, and make Derivative Works based upon the Work, - communicate to the public, including the right to make available or display the Work or copies thereof to the public and perform publicly, as the case may be, the Work, - distribute the Work or copies thereof, - lend and rent the Work or copies thereof, - sublicense rights in the Work or copies thereof. Those rights can be exercised on any media, supports and formats, whether now known or later invented, as far as the applicable law permits so. In the countries where moral rights apply, the Licensor waives his right to exercise his moral right to the extent allowed by law in order to make effective the licence of the economic rights here above listed. 
The Licensor grants to the Licensee royalty-free, non-exclusive usage rights to any patents held by the Licensor, to the extent necessary to make use of the rights granted on the Work under this Licence. 3. Communication of the Source Code The Licensor may provide the Work either in its Source Code form, or as Executable Code. If the Work is provided as Executable Code, the Licensor provides in addition a machine-readable copy of the Source Code of the Work along with each copy of the Work that the Licensor distributes or indicates, in a notice following the copyright notice attached to the Work, a repository where the Source Code is easily and freely accessible for as long as the Licensor continues to distribute or communicate the Work. 4. Limitations on copyright Nothing in this Licence is intended to deprive the Licensee of the benefits from any exception or limitation to the exclusive rights of the rights owners in the Work, of the exhaustion of those rights or of other applicable limitations thereto. 5. Obligations of the Licensee The grant of the rights mentioned above is subject to some restrictions and obligations imposed on the Licensee. Those obligations are the following: Attribution right: The Licensee shall keep intact all copyright, patent or trademarks notices and all notices that refer to the Licence and to the disclaimer of warranties. The Licensee must include a copy of such notices and a copy of the Licence with every copy of the Work he/she distributes or communicates. The Licensee must cause any Derivative Work to carry prominent notices stating that the Work has been modified and the date of modification. 
Copyleft clause: If the Licensee distributes or communicates copies of the Original Works or Derivative Works, this Distribution or Communication will be done under the terms of this Licence or of a later version of this Licence unless the Original Work is expressly distributed only under this version of the Licence — for example by communicating ‘EUPL v. 1.2 only’. The Licensee (becoming Licensor) cannot offer or impose any additional terms or conditions on the Work or Derivative Work that alter or restrict the terms of the Licence. Compatibility clause: If the Licensee Distributes or Communicates Derivative Works or copies thereof based upon both the Work and another work licensed under a Compatible Licence, this Distribution or Communication can be done under the terms of this Compatible Licence. For the sake of this clause, ‘Compatible Licence’ refers to the licences listed in the appendix attached to this Licence. Should the Licensee's obligations under the Compatible Licence conflict with his/her obligations under this Licence, the obligations of the Compatible Licence shall prevail. Provision of Source Code: When distributing or communicating copies of the Work, the Licensee will provide a machine-readable copy of the Source Code or indicate a repository where this Source will be easily and freely available for as long as the Licensee continues to distribute or communicate the Work. Legal Protection: This Licence does not grant permission to use the trade names, trademarks, service marks, or names of the Licensor, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the copyright notice. 6. Chain of Authorship The original Licensor warrants that the copyright in the Original Work granted hereunder is owned by him/her or licensed to him/her and that he/she has the power and authority to grant the Licence. 
Each Contributor warrants that the copyright in the modifications he/she brings to the Work are owned by him/her or licensed to him/her and that he/she has the power and authority to grant the Licence. Each time You accept the Licence, the original Licensor and subsequent Contributors grant You a licence to their contributions to the Work, under the terms of this Licence. 7. Disclaimer of Warranty The Work is a work in progress, which is continuously improved by numerous Contributors. It is not a finished work and may therefore contain defects or ‘bugs’ inherent to this type of development. For the above reason, the Work is provided under the Licence on an ‘as is’ basis and without warranties of any kind concerning the Work, including without limitation merchantability, fitness for a particular purpose, absence of defects or errors, accuracy, non-infringement of intellectual property rights other than copyright as stated in Article 6 of this Licence. This disclaimer of warranty is an essential part of the Licence and a condition for the grant of any rights to the Work. 8. Disclaimer of Liability Except in the cases of wilful misconduct or damages directly caused to natural persons, the Licensor will in no event be liable for any direct or indirect, material or moral, damages of any kind, arising out of the Licence or of the use of the Work, including without limitation, damages for loss of goodwill, work stoppage, computer failure or malfunction, loss of data or any commercial damage, even if the Licensor has been advised of the possibility of such damage. However, the Licensor will be liable under statutory product liability laws as far such laws apply to the Work. 9. Additional agreements While distributing the Work, You may choose to conclude an additional agreement, defining obligations or services consistent with this Licence. 
However, if accepting obligations, You may act only on your own behalf and on your sole responsibility, not on behalf of the original Licensor or any other Contributor, and only if You agree to indemnify, defend, and hold each Contributor harmless for any liability incurred by, or claims asserted against such Contributor by the fact You have accepted any warranty or additional liability. 10. Acceptance of the Licence The provisions of this Licence can be accepted by clicking on an icon ‘I agree’ placed under the bottom of a window displaying the text of this Licence or by affirming consent in any other similar way, in accordance with the rules of applicable law. Clicking on that icon indicates your clear and irrevocable acceptance of this Licence and all of its terms and conditions. Similarly, you irrevocably accept this Licence and all of its terms and conditions by exercising any rights granted to You by Article 2 of this Licence, such as the use of the Work, the creation by You of a Derivative Work or the Distribution or Communication by You of the Work or copies thereof. 11. Information to the public In case of any Distribution or Communication of the Work by means of electronic communication by You (for example, by offering to download the Work from a remote location) the distribution channel or media (for example, a website) must at least provide to the public the information requested by the applicable law regarding the Licensor, the Licence and the way it may be accessible, concluded, stored and reproduced by the Licensee. 12. Termination of the Licence The Licence and the rights granted hereunder will terminate automatically upon any breach by the Licensee of the terms of the Licence. Such a termination will not terminate the licences of any person who has received the Work from the Licensee under the Licence, provided such persons remain in full compliance with the Licence. 13. 
Miscellaneous Without prejudice of Article 9 above, the Licence represents the complete agreement between the Parties as to the Work. If any provision of the Licence is invalid or unenforceable under applicable law, this will not affect the validity or enforceability of the Licence as a whole. Such provision will be construed or reformed so as necessary to make it valid and enforceable. The European Commission may publish other linguistic versions or new versions of this Licence or updated versions of the Appendix, so far this is required and reasonable, without reducing the scope of the rights granted by the Licence. New versions of the Licence will be published with a unique version number. All linguistic versions of this Licence, approved by the European Commission, have identical value. Parties can take advantage of the linguistic version of their choice. 14. Jurisdiction Without prejudice to specific agreement between parties, - any litigation resulting from the interpretation of this License, arising between the European Union institutions, bodies, offices or agencies, as a Licensor, and any Licensee, will be subject to the jurisdiction of the Court of Justice of the European Union, as laid down in article 272 of the Treaty on the Functioning of the European Union, - any litigation arising between other parties and resulting from the interpretation of this License, will be subject to the exclusive jurisdiction of the competent court where the Licensor resides or conducts its primary business. 15. Applicable Law Without prejudice to specific agreement between parties, - this Licence shall be governed by the law of the European Union Member State where the Licensor has his seat, resides or has his registered office, - this licence shall be governed by Belgian law if the Licensor has no seat, residence or registered office inside a European Union Member State. Appendix ‘Compatible Licences’ according to Article 5 EUPL are: - GNU General Public License (GPL) v. 
2, v. 3 - GNU Affero General Public License (AGPL) v. 3 - Open Software License (OSL) v. 2.1, v. 3.0 - Eclipse Public License (EPL) v. 1.0 - CeCILL v. 2.0, v. 2.1 - Mozilla Public Licence (MPL) v. 2 - GNU Lesser General Public Licence (LGPL) v. 2.1, v. 3 - Creative Commons Attribution-ShareAlike v. 3.0 Unported (CC BY-SA 3.0) for works other than software - European Union Public Licence (EUPL) v. 1.1, v. 1.2 - Québec Free and Open-Source Licence — Reciprocity (LiLiQ-R) or Strong Reciprocity (LiLiQ-R+). The European Commission may update this Appendix to later versions of the above licences without producing a new version of the EUPL, as long as they provide the rights granted in Article 2 of this Licence and protect the covered Source Code from exclusive appropriation. All other changes or additions to this Appendix require the production of a new EUPL version. hare-update-0.25.2.0/DESIGN.md000066400000000000000000000012421503370650000154260ustar00rootroot00000000000000# hare::lex, ::parse, and ::ast Under each v* directory (e.g. v0_24_2, vNEXT, etc) is a vendored fork of hare::ast, hare::lex, and hare::parse from the Hare standard library that corresponds to the appropriate release. These modules have been modified in two respects: 1. They have been updated to handle Hare source files both from the version in question and from one version prior. 2. Instrumentation has been installed throughout to allow hare-update to hook into the parser at various stages. A fork of the Hare master branch is maintained in the vNEXT module, and when a new Hare release ships this module is copied to v0_X_Y (as appropriate) for posterity. hare-update-0.25.2.0/Makefile000066400000000000000000000014071503370650000155760ustar00rootroot00000000000000.POSIX: .SUFFIXES: HARE=hare HAREFLAGS= DESTDIR= PREFIX=/usr/local LIBEXECDIR=$(PREFIX)/libexec HARE_SOURCES != find . 
-name '*.ha' | grep -v '^./versions/.*/v.*\.ha' all: hare-update hare-update-genrules VERSIONS=\ versions/v0_25_2/v0.25.2.ha versions/v0_25_2/v0.25.2.ha: versions/v0_25_2/v0.25.2.ha.in hare-update-genrules ./hare-update-genrules < $< > $@ hare-update: $(HARE_SOURCES) $(VERSIONS) $(HARE) build $(HAREFLAGS) -o $@ cmd/$@/ hare-update-genrules: $(HARE_SOURCES) $(HARE) build $(HAREFLAGS) -o $@ cmd/$@/ check: $(HARE) test $(HAREFLAGS) clean: rm -f hare-update hare-update-genrules install: install -Dm755 hare-update $(DESTDIR)$(LIBEXECDIR)/hare/hare-update uninstall: rm -f $(DESTDIR)$(LIBEXECDIR)/hare/hare-update .PHONY: all check clean install uninstall hare-update-0.25.2.0/README.md000066400000000000000000000014341503370650000154150ustar00rootroot00000000000000# hare-update hare-update is a Hare add-on which assists in migrating a Hare codebase to a newer release of Hare by scanning your code, identifying areas impacted by breaking changes, and suggesting the appropriate fix. **RFC**: This tool is a work-in-progress. Feedback (and patches) welcome. ## Installation ``` $ make # make install $ hare tool update -h ``` ## Usage Run `hare tool update` to apply upgrades from Hare version N-1 to version N, where N is the latest stable release of Hare. To upgrade to a different release, use `hare tool update -l` to list the supported releases, then select one with `-t`. If building from the development branch (rather than a tagged release of hare-update), by default hare-update will update your codebase to support the development branch of Hare upstream (i.e. "v0.next").
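For example, a typical session (flags taken from the tool's built-in help: `-d` prints a unified diff instead of editing files, `-l` lists target versions):

```
$ hare tool update -l    # list supported target versions
$ hare tool update -d    # preview changes as a unified diff
$ hare tool update       # apply changes in place
```

Because the tool rewrites sources in place, it refuses to run outside of a version-controlled checkout unless the `-V` flag is passed.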
hare-update-0.25.2.0/cmd/000077500000000000000000000000001503370650000146775ustar00rootroot00000000000000hare-update-0.25.2.0/cmd/hare-update-genrules/000077500000000000000000000000001503370650000207205ustar00rootroot00000000000000hare-update-0.25.2.0/cmd/hare-update-genrules/main.ha000066400000000000000000000025271503370650000221640ustar00rootroot00000000000000use bufio; use common; use fmt; use fs; use getopt; use io; use os; use path; use v0_25_2::lex; use v0_25_2::unparse; const help: [_]getopt::help = [ "generate hare-update rules from stdin and write to stdout", ]; export fn main() void = { const cmd = getopt::parse(os::args, help...); defer getopt::finish(&cmd); if (len(cmd.args) != 0) { getopt::printusage(os::stderr, "hare-update-genrules", help)!; os::exit(os::status::FAILURE); }; fmt::print(`// Code generated by hare-update-genrules // Do not edit by hand! use common::{ltok, nonterminal}; use glue; use io; use rules; use rules::{getvar}; `)!; const in = bufio::newscanner(os::stdin); defer bufio::finish(&in); const lex = lex::init(&in, ""); let ver = version { ... }; defer version_finish(&ver); match (scan(&ver, os::stdout, &lex)) { case let err: common::error => fmt::fatal(common::strerror(err)); case void => void; }; fmt::println()!; fmt::printf(` export const {} = rules::version {{ name = "{}", `, ver.symbol, ver.name)!; for (let prop .. ver.props) { const (name, expr) = prop; fmt::printf("\t{} = ", name)!; unparse::expr(os::stdout, &unparse::syn_nowrap, &expr)!; fmt::println(",")!; }; fmt::printfln("\trules = [")!; for (let rule .. 
ver.rules) { fmt::printfln("\t\t&{},", rule)!; }; fmt::printfln("\t],")!; fmt::printfln("}};")!; }; hare-update-0.25.2.0/cmd/hare-update-genrules/scan.ha000066400000000000000000000210261503370650000221570ustar00rootroot00000000000000use common; use common::{ltok, token, location, nonterminal}; use fmt; use io; use memio; use strings; use v0_25_2::ast; use v0_25_2::lex; use v0_25_2::parse; use v0_25_2::parse::{want, try, peek}; use v0_25_2::unparse; let prevloc = location { line = 1, col = 1, ... }; let stack: []ltok = []; let sp = 0; // Scans and processes tokens until the stack pointer is equal to exit_sp. fn scan( ver: *version, out: io::handle, lex: *lex::lexer, exit_sp: int = -1, expand_vars: bool = true, ) (void | common::error) = { for (true) { const tok = lex::lex(lex)?; let want = ltok::EOF; switch (tok.0) { case ltok::EOF => break; // Extension tokens case ltok::EXT_VERSION => scan_version(out, ver, lex, tok)?; continue; case ltok::EXT_RULE => scan_rule(out, ver, lex, tok)?; continue; case ltok::EXT_MATCH => scan_match(out, ver, lex, tok)?; continue; case ltok::EXT_EDIT => scan_edit(out, ver, lex, tok)?; continue; case ltok::EXT_APPEND => scan_append(out, ver, lex, tok)?; continue; case ltok::EXT_INSERT => scan_insert(out, ver, lex, tok)?; continue; case ltok::EXT_DELETE => scan_delete(out, ver, lex, tok)?; continue; case ltok::EXT_REPLACE => scan_replace(out, ver, lex, tok)?; continue; case ltok::EXT_PRESENT => scan_present(out, ver, lex, tok)?; continue; case ltok::EXT_CHOICE => scan_choice(out, ver, lex, tok)?; continue; case ltok::DOLLAR => if (expand_vars) { scan_variable(out, ver, lex, tok)?; continue; }; // Hare tokens case ltok::LPAREN, ltok::LBRACKET, ltok::LBRACE => append(stack, tok.0)!; sp += 1; case ltok::RPAREN => want = ltok::LPAREN; case ltok::RBRACKET => want = ltok::LBRACKET; case ltok::RBRACE => want = ltok::LBRACE; case => void; }; if (want != ltok::EOF) { let have = stack[sp - 1]; delete(stack[sp - 1]); sp -= 1; if (have != want) { 
return lex::syntaxerr(tok.2, "unbalanced tokens"); }; if (sp == exit_sp) { break; }; }; pass(out, lex, tok); }; }; fn scan_version( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { want(lex, ltok::LPAREN)?; ver.name = want(lex, ltok::LIT_STR)?.1 as str; want(lex, ltok::RPAREN)?; want(lex, ltok::LBRACE)?; for (true) { const tok = want(lex, ltok::NAME, ltok::RBRACE)?; const name = switch (tok.0) { case ltok::RBRACE => break; case ltok::NAME => yield tok.1 as str; case => abort(); }; want(lex, ltok::EQUAL)?; const value = parse::expr(lex)?; append(ver.props, (name, value))!; try(lex, ltok::COMMA)?; }; ver.symbol = version_tosym(ver.name); want(lex, ltok::SEMICOLON)?; prevloc = lex::mkloc(lex); }; fn scan_rule( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { want(lex, ltok::LPAREN)?; const name = want(lex, ltok::LIT_STR)?.1 as str; want(lex, ltok::RPAREN)?; want(lex, ltok::DOUBLE_COLON)?; let hooks: []str = []; defer strings::freeall(hooks); for (true) { const hook = want(lex, ltok::LIT_STR)?; const nonterm = common::str_to_nonterminal(hook.1 as str); if (nonterm == nonterminal::NONE) { return lex::syntaxerr(hook.2, "Invalid hook"); }; const hook = fmt::asprintf("nonterminal::{}", common::nonterminal_enum_str(nonterm))!; append(hooks, hook)!; if (try(lex, ltok::COMMA) is void) { break; }; }; const serial = len(ver.rules) + 1; const rule_sym = fmt::asprintf("rule_{}", serial)!; append(ver.rules, rule_sym)!; want(lex, ltok::LBRACE)?; push(ltok::LBRACE); const hooks = strings::join(", ", hooks...)!; fmt::fprintf(out, ` let {0} = rules::rule {{ serial = {1}, name = "{2}", hooks = [{3}], exec = &{0}_exec, remember = -1, }}; fn {0}_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = {{ const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &{0};`, rule_sym, serial, name, hooks)!; prevloc = lex::mkloc(lex); }; fn 
scan_match( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { want(lex, ltok::LBRACE)?; const exit_sp = sp; push(ltok::LBRACE); let patterns: []str = []; defer strings::freeall(patterns); for (true) { const buf = memio::dynamic(); defer io::close(&buf)!; scan(ver, &buf, lex, exit_sp, false)?; const pattern = strings::dup(memio::string(&buf)!)!; append(patterns, pattern)!; const tok = lex::lex(lex)?; switch (tok.0) { case ltok::COMMA => want(lex, ltok::LBRACE)?; push(ltok::LBRACE); case => lex::unlex(lex, tok); break; }; }; fmt::fprintfln(out, ` let __captures = rules::captures {{ ... }}; if (!rules::match_pattern(ctx, &__captures,`)!; for (let pat .. patterns) { fmt::fprint(out, "\t\t\"")!; escape_string(out, strings::trim(pat)); fmt::fprintln(out, "\",")!; }; fmt::fprintfln(out, ` )?) {{ return; }}; defer rules::captures_finish({}.glue, &__captures);`, ver.symbol)!; want(lex, ltok::SEMICOLON)?; prevloc = lex::mkloc(lex); }; fn scan_variable( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { pass_whitespace(out, lex, tok); match (try(lex, ltok::NAME)?) { case let tok: token => const name = tok.1 as str; fmt::fprint(out, `getvar(&__captures, "`)!; escape_string(out, name); fmt::fprint(out, `")`)!; case void => fmt::fprint(out, "__captures")!; }; prevloc = lex::mkloc(lex); }; fn scan_edit( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { want(lex, ltok::LBRACE)?; const rule = ver.rules[len(ver.rules)-1]; fmt::fprintf(out, ` {{ let __eg = &rules::editgroup {{ rule = &{}, ... 
}};`, rule)!; const exit_sp = sp; push(ltok::LBRACE); scan(ver, out, lex, exit_sp)?; fmt::fprint(out, ` yield __eg; }; `)!; want(lex, ltok::SEMICOLON)?; }; fn scan_append( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { pass_whitespace(out, lex, tok); want(lex, ltok::LPAREN)?; push(ltok::LPAREN); fmt::fprint(out, "rules::edit_append(__eg, ")!; prevloc = lex::mkloc(lex); }; fn scan_insert( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { pass_whitespace(out, lex, tok); want(lex, ltok::LPAREN)?; push(ltok::LPAREN); fmt::fprint(out, "rules::edit_insert(__eg, ")!; prevloc = lex::mkloc(lex); }; fn scan_delete( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { pass_whitespace(out, lex, tok); want(lex, ltok::LPAREN)?; push(ltok::LPAREN); fmt::fprint(out, "rules::edit_delete(__eg, ")!; prevloc = lex::mkloc(lex); }; fn scan_replace( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { pass_whitespace(out, lex, tok); want(lex, ltok::LPAREN)?; push(ltok::LPAREN); fmt::fprint(out, "rules::edit_replace(__eg, ")!; prevloc = lex::mkloc(lex); }; fn scan_present( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { want(lex, ltok::LPAREN)?; const editgroup = want(lex, ltok::NAME)?.1 as str; want(lex, ltok::COMMA)?; const details = want(lex, ltok::LIT_STR)?.1 as str; want(lex, ltok::RPAREN)?; want(lex, ltok::SEMICOLON)?; fmt::fprintf(out, ` if (rules::present(ctx, {}, "`, editgroup)!; escape_string(out, details); fmt::fprintf(out, `")?) {{ rules::merge_edits(ctx, {}); }};`, editgroup)!; prevloc = lex::mkloc(lex); }; fn scan_choice( out: io::handle, ver: *version, lex: *lex::lexer, tok: token, ) (void | common::error) = { want(lex, ltok::LBRACE)?; let choices: [](str, str) = []; defer { for (let c .. 
choices) { free(c.0); free(c.1); }; free(choices); }; for (true) { let tok = want(lex, ltok::CASE, ltok::RBRACE)?; if (tok.0 == ltok::RBRACE) { break; }; tok = want(lex, ltok::LIT_STR)?; const text = strings::dup(tok.1 as str)!; want(lex, ltok::ARROW)?; tok = want(lex, ltok::NAME)?; const var = strings::dup(tok.1 as str)!; append(choices, (text, var))!; tok = want(lex, ltok::COMMA, ltok::RBRACE)?; if (tok.0 == ltok::RBRACE) { break; }; }; want(lex, ltok::SEMICOLON)?; prevloc = lex::mkloc(lex); const rule = ver.rules[len(ver.rules)-1]; fmt::fprintf(out, ` match (rules::choose(ctx, &{}, __location, `, rule)!; for (let choice .. choices) { fmt::fprint(out, ` ("`)!; escape_string(out, choice.0); fmt::fprintfln(out, `", {}),`, choice.1)!; }; fmt::fprint(out, ` )?) { case void => return; case let edit: *rules::editgroup => rules::merge_edits(ctx, edit); }; `)!; }; hare-update-0.25.2.0/cmd/hare-update-genrules/util.ha000066400000000000000000000032751503370650000222160ustar00rootroot00000000000000use ascii; use common; use common::{ltok, token, location, nonterminal}; use fmt; use io; use strings; use v0_25_2::lex; fn version_tosym(name: const str) str = { return strings::replace(name, ".", "_")!; }; fn pass(out: io::handle, lex: *lex::lexer, tok: token) void = { pass_whitespace(out, lex, tok); match (tok.1) { case let s: str => switch (tok.0) { case ltok::NAME => fmt::fprintf(out, `{}`, s)!; case ltok::LIT_STR => fmt::fprint(out, '"')!; escape_string(out, s); fmt::fprint(out, '"')!; case => abort(); }; case let rn: rune => fmt::fprint(out, "'")!; escape_rune(out, rn); fmt::fprint(out, "'")!; case let u: u64 => fmt::fprint(out, u)!; case let f: f64 => fmt::fprint(out, f)!; case void => fmt::fprint(out, common::tokstr(tok))!; }; }; fn pass_whitespace(out: io::handle, lex: *lex::lexer, tok: token) void = { const loc = tok.2; for (prevloc.line < loc.line) { fmt::fprint(out, "\n")!; prevloc.line += 1; prevloc.col = 1; }; for (prevloc.col < loc.col) { fmt::fprint(out, " ")!; 
prevloc.col += 1; }; prevloc = lex::mkloc(lex); }; fn escape_string(out: io::handle, s: str) void = { const iter = strings::iter(s); for (const rn => strings::next(&iter)) { escape_rune(out, rn); }; }; fn escape_rune(out: io::handle, rn: rune) void = { switch (rn) { case '\\' => fmt::fprint(out, `\\`)!; case '\n' => fmt::fprint(out, `\n`)!; case '\r' => fmt::fprint(out, `\r`)!; case '\t' => fmt::fprint(out, `\t`)!; case '"' => fmt::fprint(out, `\"`)!; case => if (ascii::isprint(rn)) { fmt::fprint(out, rn)!; } else { fmt::fprintf(out, `\u{:.4x}`, rn: u32)!; }; }; }; fn push(t: ltok) void = { append(stack, t)!; sp += 1; }; hare-update-0.25.2.0/cmd/hare-update-genrules/version.ha000066400000000000000000000005221503370650000227160ustar00rootroot00000000000000use strings; use v0_25_2::ast; type version = struct { name: str, symbol: str, props: [](str, ast::expr), rules: []str, }; fn version_finish(ver: *version) void = { free(ver.name); free(ver.symbol); for (let prop .. ver.props) { free(prop.0); ast::expr_finish(&prop.1); }; free(ver.props); strings::freeall(ver.rules); }; hare-update-0.25.2.0/cmd/hare-update/000077500000000000000000000000001503370650000170765ustar00rootroot00000000000000hare-update-0.25.2.0/cmd/hare-update/main.ha000066400000000000000000000070751503370650000203450ustar00rootroot00000000000000use common; use errors; use fmt; use fs; use getopt; use os; use path; use rules; use strings; use versions; type mode = enum { APPLY, DIFF, }; export fn main() void = { const help: [_]getopt::help = [ "assist in upgrading a codebase to a newer Hare release", ('d', "output a unified diff of the changes"), ('l', "list available Hare versions"), ('t', "version", "select target Hare version"), ('y', "accept the default answers for each prompt (use with caution)"), ('V', "skip VCS safety check"), "[files...]", ]; const cmd = getopt::parse(os::args, help...); defer getopt::finish(&cmd); let mode = mode::APPLY; let ver = versions::latest; let skipvcs = false; let yes = 
false; for (let opt .. cmd.opts) { const (opt, val) = opt; switch (opt) { case 'd' => mode = mode::DIFF; case 'l' => for (let ver .. versions::versions) { fmt::printfln("* {0} ({1} => {0})", ver.name, ver.down)!; }; os::exit(os::status::SUCCESS); case 't' => ver = match (versions::get(val)) { case let ver: *rules::version => yield ver; case errors::noentry => fmt::fatalf("Error: unknown version {}", val); }; case 'y' => yes = true; case 'V' => skipvcs = true; case => abort(); }; }; if (mode == mode::APPLY && !skipvcs) { checkvcs(); }; const engine = rules::new_engine(ver, yes); defer rules::destroy(engine); let files = cmd.args; if (len(cmd.args) == 0) { files = []; collect(".", &files); }; defer if (len(cmd.args) == 0) { strings::freeall(files); }; let nedit = 0, nfile = 0; for (let path .. files) { const ctx = rules::new_context(engine, path)!; const doc = match (rules::exec(ctx)) { case let err: common::error => if (err is errors::cancelled) { os::exit(os::status::FAILURE); }; fmt::fatal(common::strerror(err)); case let doc: rules::document => yield doc; }; defer rules::close(&doc); if (len(doc.edits) == 0) { continue; }; for (let edit &.. doc.edits) { rules::apply(&doc, edit); nedit += 1; }; switch (mode) { case mode::DIFF => rules::diff(&doc, os::stdout)!; case mode::APPLY => rules::save(&doc)!; if (!yes) { fmt::errorfln("\nApplied {} edit(s) to {}.\n", len(doc.edits), path)!; }; nfile += 1; }; }; if (mode == mode::APPLY) { if (!yes) { fmt::errorfln("----\n")!; }; fmt::errorfln("Applied {} edit(s) to {} files.", nedit, nfile)!; if (nedit != 0 && !skipvcs) { fmt::errorfln("\n{}{}Remember to review and commit the changes to version control!{}\n", rules::C_BOLD, rules::C_BLUE, rules::C_RESET)!; }; }; }; // Collect all Hare sources under a given directory. fn collect(path: str, files: *[]str) void = { let buf = path::init(path)!; const iter = os::iter(path)!; for (const ent => fs::next(iter)!) 
{ path::set(&buf, path, ent.name)!; const path = path::string(&buf); if (fs::isfile(ent.ftype)) { if (strings::hassuffix(path, ".ha")) { append(files, strings::dup(path)!)!; }; } else if (fs::isdir(ent.ftype)) { collect(path, files); }; }; }; fn checkvcs() void = { const paths = [ ".git", ".hg", ".jj", ".svn", ]; for (const path .. paths) { if (os::exists(path)) { return; }; }; fmt::errorln("Error: this tool should not be used without version control!")!; fmt::errorln("Please do one of the following before running this tool again:")!; fmt::errorln("* Set up a version control system for this codebase")!; fmt::errorln("* Generate a diff with -d instead of applying changes directly")!; fmt::errorln("* Disable this check with -V")!; os::exit(os::status::FAILURE); }; hare-update-0.25.2.0/common/000077500000000000000000000000001503370650000154245ustar00rootroot00000000000000hare-update-0.25.2.0/common/README000066400000000000000000000003401503370650000163010ustar00rootroot00000000000000common:: stores types common to all versions of Hare. For instance, hare::lex::ltok from the standard library has been relocated to common:: and updated to include all tokens from all Hare versions supported by hare-update. hare-update-0.25.2.0/common/error.ha000066400000000000000000000010151503370650000170640ustar00rootroot00000000000000use fmt; use io; // A syntax error export type syntax = !(location, str); // All possible lexer errors export type error = !(io::error | syntax); // Returns a human-friendly string for a given error. The result may be // statically allocated. 
export fn strerror(err: error) const str = {
	static let buf: [2048]u8 = [0...];
	match (err) {
	case let err: io::error =>
		return io::strerror(err);
	case let s: syntax =>
		return fmt::bsprintf(buf, "{}:{}:{}: syntax error: {}",
			s.0.path, s.0.line, s.0.col, s.1)!;
	};
};

==> hare-update-0.25.2.0/common/lex.ha <==
// SPDX-License-Identifier: MPL-2.0
// (c) Hare authors
use io;

// A lexical token class.
export type ltok = enum uint {
	// Keep ordered with bmap
	// Alpha sorted
	EXT_APPEND, // hare-update
	EXT_CHOICE, // hare-update
	EXT_DELETE, // hare-update
	EXT_EDIT, // hare-update
	ATTR_FINI,
	ATTR_INIT,
	EXT_INSERT, // hare-update
	EXT_MATCH, // hare-update
	ATTR_OFFSET,
	ATTR_PACKED,
	EXT_PRESENT, // hare-update
	EXT_REPLACE, // hare-update
	EXT_RULE, // hare-update
	ATTR_SYMBOL,
	ATTR_TEST,
	ATTR_THREADLOCAL,
	EXT_VERSION, // hare-update
	UNDERSCORE,
	ABORT, ALIGN, ALLOC, APPEND, AS, ASSERT, BOOL, BREAK, CASE,
	CONST, CONTINUE, DEF, DEFER, DELETE, DONE, ELSE, ENUM, EXPORT,
	F32, F64, FALSE, FN, FOR, FREE, I16, I32, I64, I8, IF, INSERT,
	INT, IS, LEN, LET, MATCH, NEVER, NOMEM, NULL, NULLABLE, OFFSET,
	OPAQUE, RETURN, RUNE, SIZE, STATIC, STR, STRUCT, SWITCH, TRUE,
	TYPE, U16, U32, U64, U8, UINT, UINTPTR, UNION, USE, VAARG,
	VAEND, VALIST, VASTART, VOID, YIELD,
	LAST_KEYWORD = YIELD,

	// Operators
	ARROW, BAND, BANDEQ, BNOT, BOR, BOREQ, BXOR, BXOREQ, COLON,
	COMMA, DIV, DIVEQ,
	DOLLAR, // hare-update
	DOT, DOUBLE_COLON, DOUBLE_DOT, ELLIPSIS, EQUAL, GT, GTEQ, LAND,
	LANDEQ, LBRACE, LBRACKET, LEQUAL, LESS, LESSEQ, LNOT, LOR,
	LOREQ, LPAREN, LSHIFT, LSHIFTEQ, LXOR, LXOREQ, MINUS, MINUSEQ,
	MODEQ, MODULO, NEQUAL, PLUS, PLUSEQ, QUESTION, RBRACE, RBRACKET,
	RPAREN, RSHIFT, RSHIFTEQ, SEMICOLON, TIMES, TIMESEQ,
	LAST_BTOK = TIMESEQ,

	LIT_U8, LIT_U16, LIT_U32, LIT_U64, LIT_UINT, LIT_SIZE,
	LIT_I8, LIT_I16, LIT_I32, LIT_I64, LIT_INT, LIT_ICONST,
	LIT_F32, LIT_F64, LIT_FCONST, LIT_RCONST, LIT_STR,
	LAST_LITERAL = LIT_STR,

	NAME,
	EOF,
};

export const
bmap: [_]str = [
	// Keep ordered with tok
	"@append@", // hare-update
	"@choice@", // hare-update
	"@delete@", // hare-update
	"@edit@", // hare-update
	"@fini",
	"@init",
	"@insert@", // hare-update
	"@match@", // hare-update
	"@offset",
	"@packed",
	"@present@", // hare-update
	"@replace@", // hare-update
	"@rule@", // hare-update
	"@symbol",
	"@test",
	"@threadlocal",
	"@version@", // hare-update
	"_",
	"abort", "align", "alloc", "append", "as", "assert", "bool",
	"break", "case", "const", "continue", "def", "defer", "delete",
	"done", "else", "enum", "export", "f32", "f64", "false", "fn",
	"for", "free", "i16", "i32", "i64", "i8", "if", "insert", "int",
	"is", "len", "let", "match", "never", "nomem", "null",
	"nullable", "offset", "opaque", "return", "rune", "size",
	"static", "str", "struct", "switch", "true", "type", "u16",
	"u32", "u64", "u8", "uint", "uintptr", "union", "use", "vaarg",
	"vaend", "valist", "vastart", "void", "yield",
	"=>", "&", "&=", "~", "|", "|=", "^", "^=", ":", ",", "/", "/=",
	"$", // hare-update
	".", "::", "..", "...", "=", ">", ">=", "&&", "&&=", "{", "[",
	"==", "<", "<=", "!", "||", "||=", "(", "<<", "<<=", "^^",
	"^^=", "-", "-=", "%=", "%", "!=", "+", "+=", "?", "}", "]",
	")", ">>", ">>=", ";", "*", "*=",
];

static assert(len(bmap) == ltok::LAST_BTOK: size + 1);

// A token value, used for tokens such as '1337' (an integer).
export type value = (str | rune | u64 | f64 | void);

// A location within a source file.
// The path is borrowed from the file name given to the lexer.
export type location = struct {
	path: str,
	line: uint,
	col: uint,
	off: io::off,
};

// A single lexical token.
export type token = (ltok, value, location);

// Converts a token to its string representation.
export fn tokstr(tok: token) const str = {
	if (tok.0 <= ltok::LAST_BTOK) {
		return bmap[tok.0: int];
	};
	switch (tok.0) {
	case ltok::LIT_U8 => return "u8";
	case ltok::LIT_U16 => return "u16";
	case ltok::LIT_U32 => return "u32";
	case ltok::LIT_U64 => return "u64";
	case ltok::LIT_UINT => return "uint";
	case ltok::LIT_SIZE => return "size";
	case ltok::LIT_I8 => return "i8";
	case ltok::LIT_I16 => return "i16";
	case ltok::LIT_I32 => return "i32";
	case ltok::LIT_I64 => return "i64";
	case ltok::LIT_INT => return "int";
	case ltok::LIT_ICONST => return "iconst";
	case ltok::LIT_F32 => return "f32";
	case ltok::LIT_F64 => return "f64";
	case ltok::LIT_FCONST => return "fconst";
	case ltok::LIT_RCONST => return "rconst";
	case ltok::LIT_STR => return "str";
	case ltok::NAME => return tok.1 as str;
	case ltok::EOF => return "EOF";
	case => abort();
	};
};

==> hare-update-0.25.2.0/common/nonterminal.ha <==
// A non-terminal in the Hare grammar which the user can use for parser hooks.
export type nonterminal = enum uint {
	NONE,
	ALLOC_EXPRESSION, // hook data: none
	APPEND_EXPRESSION, // hook data: none
	CALL_EXPRESSION, // hook data: ast::expr (lvalue)
	EXPRESSION, // hook data: none
	IDENTIFIER, // hook data: none
	IMPORTS, // hook data: none
	INSERT_EXPRESSION, // hook data: none
	FUNCTION_DECLARATION, // hook data: none
	PROTOTYPE, // hook data: none
	LAST = PROTOTYPE,

	// Pseudo-terminals
	LOCATION,
	BALANCED,
};

export fn str_to_nonterminal(kind: str) nonterminal = {
	switch (kind) {
	case "allocation-expression" => return nonterminal::ALLOC_EXPRESSION;
	case "append-expression" => return nonterminal::APPEND_EXPRESSION;
	case "call-expression" => return nonterminal::CALL_EXPRESSION;
	case "expression" => return nonterminal::EXPRESSION;
	case "identifier" => return nonterminal::IDENTIFIER;
	case "imports" => return nonterminal::IMPORTS;
	case "insert-expression" => return nonterminal::INSERT_EXPRESSION;
	case "function-declaration" => return nonterminal::FUNCTION_DECLARATION;
	case "prototype" => return nonterminal::PROTOTYPE;
	case "location" => return nonterminal::LOCATION;
	case "balanced" => return nonterminal::BALANCED;
	case => return nonterminal::NONE;
	};
};

export fn nonterminal_enum_str(nt: nonterminal) const str = {
	switch (nt) {
	case nonterminal::ALLOC_EXPRESSION => return "ALLOC_EXPRESSION";
	case nonterminal::APPEND_EXPRESSION => return "APPEND_EXPRESSION";
	case nonterminal::CALL_EXPRESSION => return "CALL_EXPRESSION";
	case nonterminal::EXPRESSION => return "EXPRESSION";
	case nonterminal::IDENTIFIER => return "IDENTIFIER";
	case nonterminal::IMPORTS => return "IMPORTS";
	case nonterminal::INSERT_EXPRESSION => return "INSERT_EXPRESSION";
	case nonterminal::FUNCTION_DECLARATION => return "FUNCTION_DECLARATION";
	case nonterminal::PROTOTYPE => return "PROTOTYPE";
	case nonterminal::LOCATION => return "LOCATION";
	case nonterminal::BALANCED => return "BALANCED";
	case nonterminal::NONE => return "NONE";
	};
};
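As a quick illustration of how the string/enum mapping above is meant to round-trip (a hypothetical sketch, not a file in this archive; the hook names and the `example` function are assumptions for illustration only):

```hare
use common;
use fmt;

// Hypothetical usage sketch: a rule table could name its hooks by string
// and resolve each name through str_to_nonterminal; NONE signals a typo'd
// or unsupported hook name.
fn example() void = {
	const hooks = ["call-expression", "prototype"];
	for (let name .. hooks) {
		const nt = common::str_to_nonterminal(name);
		assert(nt != common::nonterminal::NONE);
		fmt::printfln("{} -> {}", name,
			common::nonterminal_enum_str(nt))!;
	};
};
```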
==> hare-update-0.25.2.0/glue/README <==
glue:: provides glue code between various supported versions of Hare.

==> hare-update-0.25.2.0/glue/glue.ha <==
use bufio;
use common;
use common::{nonterminal};

// hare::lex::lexer
export type lexer = opaque;

// hare::lex::restore_point
export type restore_point = opaque;

// The glue interface defines for any given Hare version entry points which can
// be used to access that version's equivalent of hare::lex::* et al.
export type glue = struct {
	// Initialize a new lexer. The return value should be placed into the
	// heap, to be freed with glue.lex_free later.
	lex_init: *fn(in: *bufio::scanner, path: str) *lexer,
	// Frees a lexer created with glue.lex_init.
	lex_free: *fn(lex: *lexer) void,
	// hare::lex::lex
	lex_lex: *fn(lex: *lexer) (common::token | common::error),
	// hare::lex::unlex
	lex_unlex: *fn(lex: *lexer, tok: common::token) void,
	// hare::lex::save
	lex_save: *fn(lex: *lexer) *restore_point,
	// hare::lex::restore
	lex_restore: *fn(lex: *lexer, rp: *restore_point) void,
	// hare::lex::mkloc
	lex_mkloc: *fn(lex: *lexer) common::location,

	// Parser glue
	parse: *fn(lex: *lexer) (void | common::error),
	parse_register_hook: *fn(target: nonterminal, func: *opaque,
		user: nullable *opaque) void,
	parse_nonterminal: *fn(lex: *lexer,
		kind: nonterminal) (nullable *opaque | common::error),
	free_nonterminal: *fn(kind: nonterminal, data: nullable *opaque) void,
};

==> hare-update-0.25.2.0/rules/choice.ha <==
use bufio;
use common::{location};
use errors;
use fmt;
use io;
use strconv;

def REMEMBER_SKIP = -2;

// Prompts the user to choose between several options.
export fn choose(
	ctx: *context,
	rule: *rule,
	loc: location,
	choices: (str, *editgroup)...
) (*editgroup | void | errors::cancelled) = {
	if (ctx.engine.yes) {
		return choices[0].1;
	};
	if (rule.remember >= 0) {
		return choices[rule.remember].1;
	};
	if (rule.remember == REMEMBER_SKIP) {
		return;
	};

	show_rule(ctx, rule);

	let ctxlines = 3;
	let ectx = edit_getcontext(ctx, loc, ctxlines);
	defer edit_context_finish(&ectx);
	edit_context_show(ctx, &ectx);

	const scan = bufio::newscanner(ctx.engine.tty);
	defer bufio::finish(&scan);
	for (true) {
		fmt::fprintln(ctx.engine.tty, "\nSuggested solutions:")!;
		let num = 1;
		for (let choice .. choices) {
			fmt::fprintfln(ctx.engine.tty, "{}: {}", num, choice.0)!;
			num += 1;
		};
		fmt::fprint(ctx.engine.tty,
			"Enter choice, or [s]kip, [S]kip always, [p]review all, [q]uit, or [?] (1): ")!;

		// Bound with let, not const: the empty-input default below
		// overwrites it with "1".
		let choice = match (bufio::scan_line(&scan)!) {
		case let line: str =>
			yield line;
		case io::EOF =>
			fmt::fatal("Aborted");
		};

		switch (choice) {
		case "?" =>
			choose_show_help(ctx);
		case "+" =>
			ctxlines += 2;
			edit_context_finish(&ectx);
			ectx = edit_getcontext(ctx, loc, ctxlines);
			edit_context_show(ctx, &ectx);
			fmt::fprintln(ctx.engine.tty)!;
		case "q" =>
			fmt::errorln("Warning! Edits to some files may have already been written.")!;
			return errors::cancelled;
		case "p" =>
			preview_choices(ctx, choices...);
		case "s" =>
			return;
		case "S" =>
			rule.remember = REMEMBER_SKIP;
			return;
		case =>
			if (choice == "") {
				choice = "1";
			};
			const ch = match (strconv::stou(choice)) {
			case let u: uint =>
				yield u - 1;
			case =>
				continue;
			};
			if (ch >= len(choices)) {
				continue;
			};
			const (explanation, choice) = choices[ch];
			if (approve(ctx, explanation, rule, choice, ch, &scan)) {
				return choice;
			};
		};
	};
};

fn choose_show_help(ctx: *context) void = {
	fmt::fprint(ctx.engine.tty,
`Enter a number: choose this item
[p]: preview suggested edits
[+]: add more lines of context
[q]: quit, without saving changes to the current file
[?]: show this help
`)!;
};

fn preview_choices(
	ctx: *context,
	choices: (str, *editgroup)...
) void = {
	for (let choice .. choices) {
		fmt::fprintfln(ctx.engine.tty, "\n{}{}?{}",
			C_BOLD, choice.0, C_RESET)!;
		const doc = document_clone(ctx);
		defer document_destroy(&doc);
		for (let edit &.. choice.1.edits) {
			apply(&doc, edit);
		};
		diff(&doc, ctx.engine.tty, 3)!;
	};
};

fn approve(
	ctx: *context,
	explanation: str,
	rule: *rule,
	edit: *editgroup,
	idx: uint,
	scan: *bufio::scanner,
) bool = {
	show_context(ctx, edit, explanation, 3);
	for (true) {
		fmt::fprint(ctx.engine.tty, " [y]es, [Y]es (always), [n]o (y): ")!;
		const choice = match (bufio::scan_line(scan)!) {
		case let line: str =>
			yield line;
		case io::EOF =>
			fmt::fatal("Aborted");
		};
		switch (choice) {
		case "", "y" =>
			return true;
		case "Y" =>
			rule.remember = idx: int;
			return true;
		case "n" =>
			return false;
		case =>
			void;
		};
	};
};

==> hare-update-0.25.2.0/rules/colors.ha <==
export def C_BOLD: str = "\x1b[1m";
export def C_BLUE: str = "\x1b[96m";
export def C_RED: str = "\x1b[31m";
export def C_GREEN: str = "\x1b[32m";
export def C_YELLOW: str = "\x1b[33m";
export def C_RESET: str = "\x1b[m";

==> hare-update-0.25.2.0/rules/context.ha <==
use bufio;
use common;
use fmt;
use fs;
use glue;
use io;
use memio;
use os;
use unix::tty;

export type engine = struct {
	ver: const *version,
	tty: io::file,
	ctx: (*context | void),
	yes: bool,
};

// Creates a new rules engine for applying updates to Hare source code for the
// desired target Hare version. Pass the return value to [[destroy]] when you're
// done with it.
export fn new_engine(ver: const *version, yes: bool = false) *engine = {
	const tty = match (tty::open()) {
	case let tty: io::file =>
		yield tty;
	case let err: tty::error =>
		fmt::fatalf("Error opening TTY: {}", tty::strerror(err));
	};
	let eng = alloc(engine {
		ver = ver,
		tty = tty,
		ctx = void,
		yes = yes,
	})!;

	const glue = ver.glue;
	for (let rule .. ver.rules) {
		for (let hook .. rule.hooks) {
			glue.parse_register_hook(hook, rule.exec, eng);
		};
	};

	return eng;
};

// Frees resources associated with an [[engine]].
export fn destroy(eng: *engine) void = {
	io::close(eng.tty)!;
	free(eng);
};

export fn getcontext(user: nullable *opaque) *context = {
	const eng = user: *engine;
	return eng.ctx as *context;
};

// Context for applying rules to a single source file.
export type context = struct {
	engine: const *engine,
	glue: const *glue::glue,
	path: str,
	file: io::file,
	stat: fs::filestat,
	buffer: *memio::stream,
	newlines: []io::off,
	scan: *bufio::scanner,
	lex: *glue::lexer,
	edits: []edit,
};

// Creates a new context for applying rules to a specific file. Pass the context
// to [[exec]] to execute the upgrade rules for this file.
export fn new_context(
	eng: *engine,
	path: str,
) (*context | fs::error | io::error) = {
	assert(eng.ctx is void, "Cannot have two contexts active at once");
	const glue = eng.ver.glue;
	const file = os::open(path, fs::flag::RDWR)?;
	const stat = os::fstat(file)!;
	let data = io::drain(file)?;
	let newlines = find_newlines(data);
	const mbuf = alloc(memio::dynamic_from(data))!;
	const scan = alloc(bufio::newscanner(mbuf))!;
	const lex = glue.lex_init(scan, path);
	const ctx = alloc(context {
		engine = eng,
		glue = glue,
		path = path,
		file = file,
		stat = stat,
		buffer = mbuf,
		newlines = newlines,
		scan = scan,
		lex = lex,
		edits = [],
	})!;
	eng.ctx = ctx;
	return ctx;
};

// Executes rules against a [[context]]. This consumes the [[context]], which
// can no longer be used.
export fn exec(ctx: *context) (document | common::error) = {
	defer context_destroy(ctx);
	ctx.glue.parse(ctx.lex)?;
	return document_init(ctx);
};

fn context_destroy(ctx: *context) void = {
	let eng = ctx.engine;
	ctx.glue.lex_free(ctx.lex);
	bufio::finish(ctx.scan);
	free(ctx.newlines);
	free(ctx.buffer);
	free(ctx.scan);
	free(ctx);
	eng.ctx = void;
};

==> hare-update-0.25.2.0/rules/diff.ha <==
use bufio;
use fmt;
use io;
use os::exec;
use os;
use strconv;
use strings;
use unix::tty;

// Generates a unified diff of this document with all of its applied edits
// compared to the base file, and writes it to the given io::handle.
export fn diff(
	doc: *document,
	out: io::handle,
	context: uint = 3,
) (void | io::error) = {
	const cmd = match (exec::cmd("diff",
			"-U", strconv::utos(context), doc.path, "-")) {
	case let cmd: exec::command =>
		yield cmd;
	case exec::error =>
		fmt::fatal("Error spawning diff(1) -- is it installed?");
	};
	const (in_rd, in_wr) = exec::pipe();
	const (out_rd, out_wr) = exec::pipe();
	exec::addfile(&cmd, os::stdin_file, in_rd)!;
	exec::addfile(&cmd, os::stdout_file, out_wr)!;
	const proc = exec::start(&cmd)!;
	io::close(in_rd)!;
	io::close(out_wr)!;

	io::writeall(in_wr, doc.buffer)!;
	io::close(in_wr)!;

	let color = false;
	match (out) {
	case let f: io::file =>
		color = tty::isatty(f);
	case =>
		void;
	};
	if (os::getenv("NOCOLOR") is str) {
		color = false;
	};

	const scan = bufio::newscanner(out_rd);
	for (const line => bufio::scan_line(&scan)!) {
		if (!color) {
			fmt::fprintln(out, line)!;
			continue;
		};
		if (strings::hasprefix(line, "-")
				&& !strings::hasprefix(line, "---")) {
			fmt::fprintfln(out, "{}{}{}", C_RED, line, C_RESET)!;
		} else if (strings::hasprefix(line, "+")
				&& !strings::hasprefix(line, "+++")) {
			fmt::fprintfln(out, "{}{}{}", C_GREEN, line, C_RESET)!;
		} else {
			fmt::fprintln(out, line)!;
		};
	};

	io::close(out_rd)!;
	exec::wait(&proc)!;
};

==> hare-update-0.25.2.0/rules/document.ha <==
use memio;
use io;
use strings;

// A Hare source document together with a set of proposed edits.
export type document = struct {
	engine: const *engine,
	path: str,
	file: io::file,
	buffer: []u8,
	edits: []edit,
	adjust: io::off,
};

// Initializes a [[document]] from a context which has been [[exec]]uted. Takes
// ownership of the context's path, edit, and file fields, as well as the byte
// buffer from its [[memio::stream]].
fn document_init(ctx: *context) document = {
	let offs = io::tell(ctx.buffer)!;
	defer io::seek(ctx.buffer, offs, io::whence::SET)!;
	io::seek(ctx.buffer, 0, io::whence::END)!;
	let buffer = memio::buffer(ctx.buffer);
	edits_sort(ctx.edits);
	return document {
		engine = ctx.engine,
		path = ctx.path,
		file = ctx.file,
		buffer = buffer,
		edits = ctx.edits,
		adjust = 0,
	};
};

// Initializes a [[document]] which is a copy of the original file from
// [[exec]].
fn document_clone(ctx: *context) document = {
	let offs = io::tell(ctx.buffer)!;
	defer io::seek(ctx.buffer, offs, io::whence::SET)!;
	io::seek(ctx.buffer, 0, io::whence::END)!;
	let buffer = memio::buffer(ctx.buffer);
	return document {
		engine = ctx.engine,
		path = ctx.path,
		file = ctx.file,
		buffer = alloc(buffer...)!,
		edits = [],
		adjust = 0,
	};
};

// Frees resources associated with a [[document]] and closes the file handle.
export fn close(doc: *document) void = {
	io::close(doc.file)!;
	finish(doc);
};

// Frees resources associated with a [[document]] (without closing its file
// handle).
fn finish(doc: *document) void = {
	for (let edit &.. doc.edits) {
		edit_finish(edit);
	};
	free(doc.edits);
};

// Frees resources associated with a [[document]] created via
// [[document_clone]].
fn document_destroy(doc: *document) void = {
	free(doc.buffer);
	finish(doc);
};

// Applies an edit to a [[document]]'s internal buffer.
export fn apply(doc: *document, edit: *edit) void = {
	if (edit.rem > 0) {
		const rem = edit.rem: io::off;
		const start = edit.off + doc.adjust;
		const end = edit.off + rem + doc.adjust;
		delete(doc.buffer[start..end]);
	};
	if (edit.ins != "") {
		const start = edit.off + doc.adjust;
		insert(doc.buffer[start], strings::toutf8(edit.ins)...)!;
	};
	doc.adjust -= edit.rem: io::off;
	doc.adjust += len(edit.ins): io::off;
};

// Writes the applied edits back to a [[document]]'s original file on disk.
export fn save(doc: *document) (void | io::error) = {
	io::seek(doc.file, 0, io::whence::SET)?;
	io::trunc(doc.file, len(doc.buffer))?;
	io::writeall(doc.file, doc.buffer)?;
};

==> hare-update-0.25.2.0/rules/edit.ha <==
use common;
use common::{location};
use io;
use sort;
use strings;

export type editgroupfunc = fn(eg: *editgroup, user: nullable *opaque) void;

// A group of related edits.
export type editgroup = struct {
	rule: const *rule,
	edits: []edit,
	onmerge: nullable *editgroupfunc,
	user: nullable *opaque,
};

// Merges a group of edits into a [[context]], to be applied later. Consumes the
// [[editgroup]] object.
export fn merge_edits(ctx: *context, group: *editgroup) void = {
	match (group.onmerge) {
	case let func: *editgroupfunc =>
		func(group, group.user);
	case null =>
		void;
	};
	append(ctx.edits, group.edits...)!;
};

// Frees resources associated with an [[editgroup]].
export fn editgroup_finish(group: *editgroup) void = {
	free(group.edits);
	free(group.user);
};

// An edit to a text file.
export type edit = struct {
	off: io::off,
	rem: size,
	ins: str,
};

// Frees resources associated with an [[edit]].
export fn edit_finish(e: *edit) void = {
	free(e.ins);
};

// Sort edits in order from smallest to largest offset.
export fn edits_sort(edits: []edit) void = {
	sort::sort(edits, size(edit), &edit_cmp)!;
};

fn edit_cmp(a: const *opaque, b: const *opaque) int = {
	const a = a: const *edit;
	const b = b: const *edit;
	if (a.off < b.off) {
		return -1;
	} else if (a.off > b.off) {
		return 1;
	} else {
		return 0;
	};
};

// Adds a callback which is run when an edit group is merged.
//
// Important: the user object is freed when the edit group is finished.
export fn edit_onmerge(
	group: *editgroup,
	func: *editgroupfunc,
	user: nullable *opaque = null,
) void = {
	assert(group.onmerge == null);
	group.onmerge = func;
	group.user = user;
};

// Removes text at the given location.
export fn edit_delete(
	group: *editgroup,
	start: location,
	end: location,
) void = {
	assert(end.off > start.off);
	append(group.edits, edit {
		off = start.off,
		rem = (end.off - start.off): size,
		ins = "",
	})!;
};

// Prepends text just before the given location.
export fn edit_prepend(
	group: *editgroup,
	loc: location,
	text: str,
) void = {
	append(group.edits, edit {
		off = loc.off - 1,
		rem = 0,
		ins = strings::dup(text)!,
	})!;
};

// Inserts text at the given location.
export fn edit_insert(
	group: *editgroup,
	loc: location,
	text: str,
) void = {
	append(group.edits, edit {
		off = loc.off,
		rem = 0,
		ins = strings::dup(text)!,
	})!;
};

// Appends text at the given location.
export fn edit_append(
	group: *editgroup,
	loc: location,
	text: str,
) void = {
	append(group.edits, edit {
		off = loc.off + 1,
		rem = 0,
		ins = strings::dup(text)!,
	})!;
};

// Replaces the text at the given location.
export fn edit_replace(
	group: *editgroup,
	start: location,
	end: location,
	text: str,
) void = {
	assert(end.off > start.off);
	append(group.edits, edit {
		off = start.off,
		rem = (end.off - start.off): size,
		ins = strings::dup(text)!,
	})!;
};

==> hare-update-0.25.2.0/rules/editctx.ha <==
use bufio;
use common::{location};
use fmt;
use io;
use memio;
use sort;
use sort::cmp;
use strings;
use types;

type edit_context = struct {
	buffer: []u8,
	loc: location,
	first_line: size,
	last_line: size,
};

fn ndigit(i: size) size = {
	let digits = 1z;
	for (i > 0; digits += 1) {
		i /= 10;
	};
	return digits;
};

fn edit_getcontext(
	ctx: *context,
	loc: location,
	nlines: int = 3,
) edit_context = {
	let first_line = sort::lbisect(ctx.newlines,
		size(io::off), &loc.off, &cmp::i64s);
	let last_line = first_line;
	for (let i = 0; i < nlines && first_line > 0; i += 1) {
		first_line -= 1;
	};
	for (let i = 0; i < nlines && last_line + 1 < len(ctx.newlines); i += 1) {
		last_line += 1;
	};
	if (last_line >= len(ctx.newlines)) {
		last_line = len(ctx.newlines) - 1;
	};
	const min = ctx.newlines[first_line + 1];
	const max = ctx.newlines[last_line];

	const offs = io::tell(ctx.buffer)!;
	defer io::seek(ctx.buffer, offs, io::whence::SET)!;
	io::seek(ctx.buffer, 0, io::whence::END)!;
	const buffer = memio::buffer(ctx.buffer);

	return edit_context {
		first_line = first_line + 1,
		last_line = last_line + 1,
		loc = loc,
		buffer = alloc(buffer[min..max]...)!,
	};
};

fn edit_context_finish(ectx: *edit_context) void = {
	free(ectx.buffer);
};

fn edit_context_show(ctx: *context, ectx: *edit_context) void = {
	const buf = memio::fixed(ectx.buffer);
	const scan = bufio::newscanner(&buf);
	defer bufio::finish(&scan);

	// Bound with let, not const: the width field is adjusted below.
	let lineno_mods = fmt::mods {
		pad = ' ',
		width = ndigit(ectx.last_line),
		...
	};
	if (lineno_mods.width == 1) {
		lineno_mods.width += 1; // =>
	};

	fmt::fprintfln(ctx.engine.tty, "{}:{}", ctx.path, ectx.first_line)!;
	let lineno = ectx.first_line;
	for (let line => bufio::scan_line(&scan)!) {
		if (lineno == ectx.loc.line) {
			fmt::fprintfln(ctx.engine.tty, "{%} | {}",
				"=>", &lineno_mods, line)!;
		} else {
			fmt::fprintfln(ctx.engine.tty, "{%} | {}",
				lineno, &lineno_mods, line)!;
		};
		lineno += 1;
	};
};

==> hare-update-0.25.2.0/rules/match.ha <==
use bufio;
use common;
use common::{token, ltok, nonterminal};
use fmt;
use glue;
use io;
use memio;
use strings;

// A captured variable from [[match_tokens]]
export type capture = struct {
	start: common::location, // Inclusive
	end: common::location, // Exclusive
	kind: nonterminal,
	name: str,
	text: str,
	data: nullable *opaque,
};

// Retrieves the text string from the source file associated with a capture. The
// return value is borrowed from the context.
export fn capture_gettext(ctx: *context, cap: *capture) str = {
	const buffer = memio::buffer(ctx.buffer);
	const data = buffer[cap.start.off..cap.end.off];
	return strings::fromutf8(data)!;
};

// Result of [[match_tokens]]
export type captures = struct {
	start: common::location,
	end: common::location,
	vars: []capture,
};

// Retrieves a variable from a [[captures]] list.
export fn getvar(c: *captures, name: str) const *capture = {
	for (let var &.. c.vars) {
		if (var.name == name) {
			return var;
		};
	};
	fmt::fatalf("Invalid capture variable name '{}'", name);
};

// Frees state associated with [[captures]].
export fn captures_finish(glue: *glue::glue, c: *captures) void = {
	for (let var &.. c.vars) {
		free(var.name);
		glue.free_nonterminal(var.kind, var.data);
	};
	free(c.vars);
	c.vars = [];
};

// Matches a token pattern (or patterns) against a lexer.
export fn match_pattern(
	ctx: *context,
	vars: *captures,
	patterns: str...
) (bool | common::error) = {
	for (let pat .. patterns) {
		const rp = ctx.glue.lex_save(ctx.lex);
		captures_finish(ctx.glue, vars);
		if (_match_pattern(ctx, vars, pat)?) {
			free(rp);
			return true;
		};
		ctx.glue.lex_restore(ctx.lex, rp);
	};
	return false;
};

fn _match_pattern(
	ctx: *context,
	vars: *captures,
	pat: str,
) (bool | common::error) = {
	const glue = ctx.glue;
	const lex = ctx.lex;
	vars.start = glue.lex_mkloc(lex);
	defer vars.end = glue.lex_mkloc(lex);

	const in = memio::fixed(strings::toutf8(pat));
	const scan = bufio::newscanner(&in);
	defer bufio::finish(&scan);
	const ref = glue.lex_init(&scan, "");
	defer glue.lex_free(ref);

	for (true) {
		let ref_tok = glue.lex_lex(ref)!;
		switch (ref_tok.0) {
		case ltok::EOF =>
			break;
		case ltok::DOLLAR =>
			let var = parse_variable(glue, ref);
			var.start = glue.lex_mkloc(lex);
			if (var.name == "*") {
				ref_tok = glue.lex_lex(ref)!;
				assert(ref_tok.0 != ltok::EOF);
				scan_until(ctx, lex, ref_tok);
				continue;
			};
			match (glue.parse_nonterminal(lex, var.kind)) {
			case let data: nullable *opaque =>
				var.data = data;
				var.end = glue.lex_mkloc(lex);
			case common::error =>
				free(var.name);
				return false;
			};
			var.text = capture_gettext(ctx, &var);
			append(vars.vars, var)!;
			continue;
		case =>
			void;
		};

		const tok = glue.lex_lex(lex)?;
		if (ref_tok.0 != tok.0) {
			glue.lex_unlex(lex, tok);
			return false;
		};
		switch (ref_tok.0) {
		case ltok::NAME =>
			const ref_name = ref_tok.1 as str;
			const name = tok.1 as str;
			if (name != ref_name) {
				glue.lex_unlex(lex, tok);
				return false;
			};
		case =>
			void;
		};
		if (vars.start.line == 0) {
			vars.start = tok.2;
		};
	};

	vars.end = glue.lex_mkloc(lex);
	return true;
};

fn parse_variable(
	glue: *glue::glue,
	ref: *glue::lexer,
) capture = {
	let var = capture { ... };
	let tok = glue.lex_lex(ref)!;
	switch (tok.0) {
	case ltok::NAME =>
		var.name = strings::dup(tok.1 as str)!;
		return var;
	case ltok::TIMES =>
		var.name = strings::dup("*")!;
		return var;
	case ltok::LBRACE =>
		void;
	case =>
		abort("Invalid match pattern");
	};

	tok = glue.lex_lex(ref)!;
	switch (tok.0) {
	case ltok::NAME =>
		var.name = strings::dup(tok.1 as str)!;
		tok = glue.lex_lex(ref)!;
	case =>
		void;
	};

	switch (tok.0) {
	case ltok::RBRACE =>
		assert(var.name != "", "Invalid match pattern");
		return var;
	case ltok::COLON =>
		void;
	case =>
		abort("Invalid match pattern");
	};

	tok = glue.lex_lex(ref)!;
	assert(tok.0 == ltok::LIT_STR, "Invalid match pattern");
	var.kind = common::str_to_nonterminal(tok.1 as str);
	assert(var.kind != nonterminal::NONE, "Invalid match pattern");

	tok = glue.lex_lex(ref)!;
	assert(tok.0 == ltok::RBRACE, "Invalid match pattern");
	return var;
};

fn scan_until(
	ctx: *context,
	lex: *glue::lexer,
	tok: token,
) void = {
	assert(tok.1 is void); // TODO?
	for (true) {
		let next = ctx.glue.lex_lex(lex)!;
		if (next.0 == tok.0) {
			break;
		};
	};
};

==> hare-update-0.25.2.0/rules/newline.ha <==
use ascii;
use common::{location};
use io;
use memio;
use sort;
use sort::cmp;
use strings;

fn find_newlines(data: []u8) []io::off = {
	let lf: []io::off = [];
	append(lf, 0)!;
	let o: io::off = 0;
	for (let b .. data) {
		if (b == '\n') {
			append(lf, o)!;
		};
		o += 1;
	};
	return lf;
};

// Finds the [[location]] of the first character on this line.
export fn line_start(ctx: *context, loc: location) location = {
	let line = sort::lbisect(ctx.newlines,
		size(io::off), &loc.off, &cmp::i64s);
	for (line >= len(ctx.newlines)) line -= 1;
	const off = ctx.newlines[line];
	return location {
		path = loc.path,
		line = loc.line,
		col = 1,
		off = off + 1,
	};
};

// Finds the [[location]] of the newline character at the end of this line.
export fn line_end(ctx: *context, loc: location) location = {
	let line = sort::rbisect(ctx.newlines,
		size(io::off), &loc.off, &cmp::i64s);
	for (line >= len(ctx.newlines)) line -= 1;
	const off = ctx.newlines[line];
	return location {
		path = loc.path,
		line = loc.line,
		col = (off - loc.off): uint,
		off = off,
	};
};

==> hare-update-0.25.2.0/rules/prompt.ha <==
use bufio;
use common::{location};
use errors;
use fmt;
use io;
use memio;
use os;
use sort;
use sort::cmp;
use types;
use strings;

def REMEMBER_UNDEF: int = -1;
def REMEMBER_NO: int = 0;
def REMEMBER_YES: int = 1;

// Presents an [[editgroup]] for user approval. Returns true if the edits should
// be merged with the context.
export fn present(
	ctx: *context,
	proposal: *editgroup,
	explanation: str,
) (bool | errors::cancelled) = {
	if (ctx.engine.yes) {
		return true;
	};

	let rule = proposal.rule;
	switch (rule.remember) {
	case REMEMBER_NO =>
		return false;
	case REMEMBER_YES =>
		return true;
	case =>
		void;
	};

	let ctxlines = 3u;
	show_context(ctx, proposal, explanation, ctxlines);
	fmt::fprint(ctx.engine.tty, "\n")!;

	const scan = bufio::newscanner(ctx.engine.tty);
	defer bufio::finish(&scan);
	for (true) {
		fmt::fprint(ctx.engine.tty,
			"[y]es [n]o [Y]es (always) [N]o (always) [q]uit [?] (y): ")!;
		const choice = match (bufio::scan_line(&scan)!) {
		case let line: str =>
			yield line;
		case io::EOF =>
			fmt::fatal("Aborted");
		};
		switch (choice) {
		case "", "y" =>
			return true;
		case "Y" =>
			rule.remember = REMEMBER_YES;
			return true;
		case "n" =>
			return false;
		case "N" =>
			rule.remember = REMEMBER_NO;
			return false;
		case "?" =>
			prompt_show_help(ctx);
		case "+" =>
			ctxlines += 2;
			show_context(ctx, proposal, explanation, ctxlines);
			fmt::fprint(ctx.engine.tty, "\n")!;
		case "q" =>
			fmt::errorln("Warning! Edits to some files may have already been written.")!;
			return errors::cancelled;
		case =>
			void;
		};
	};
};

fn prompt_show_help(ctx: *context) void = {
	fmt::fprint(ctx.engine.tty,
`[y]es: accept edit once
[Y]es: always accept proposed edit for this rule
[n]o: reject edit once
[N]o: always reject proposed edit for this rule
[+]: add more lines of context
[q]: quit, without saving changes to the current file
[?]: show this help
`)!;
};

fn show_rule(ctx: *context, rule: *rule) void = {
	fmt::fprintfln(ctx.engine.tty, "{}{}-{:.3}{}: {}{}{}\n",
		C_BLUE, ctx.engine.ver.name, rule.serial, C_RESET,
		C_BOLD, rule.name, C_RESET)!;
};

fn show_context(
	ctx: *context,
	proposal: *editgroup,
	explanation: str,
	ctxlines: uint,
) void = {
	show_rule(ctx, proposal.rule);
	const doc = document_clone(ctx);
	defer document_destroy(&doc);
	for (let edit &.. proposal.edits) {
		apply(&doc, edit);
	};
	diff(&doc, ctx.engine.tty, ctxlines)!;
	fmt::fprintf(ctx.engine.tty, "\n{}{}?{}", C_BOLD, explanation, C_RESET)!;
};

==> hare-update-0.25.2.0/rules/rule.ha <==
use common::{nonterminal};

export type rule = struct {
	serial: uint,
	name: str,
	hooks: []nonterminal,
	exec: *opaque,
	remember: int,
};

==> hare-update-0.25.2.0/rules/version.ha <==
use glue;

// A Hare version which can be the target of an update operation.
export type version = struct {
	name: const str,
	down: const str,
	glue: *glue::glue,
	rules: []const *rule,
};

==> hare-update-0.25.2.0/rules/warn.ha <==
use fmt;

// Print a warning to the console.
export fn warning(ctx: *context, rule: *rule, warn: str) void = { if (ctx.engine.yes || rule.remember >= 0) { return; }; fmt::fprintfln(ctx.engine.tty, "{}{}Warning: {}{}", C_YELLOW, C_BOLD, warn, C_RESET)!; }; hare-update-0.25.2.0/v0_25_2/000077500000000000000000000000001503370650000152105ustar00rootroot00000000000000hare-update-0.25.2.0/v0_25_2/README000066400000000000000000000000001503370650000160560ustar00rootroot00000000000000hare-update-0.25.2.0/v0_25_2/ast/000077500000000000000000000000001503370650000157775ustar00rootroot00000000000000hare-update-0.25.2.0/v0_25_2/ast/README000066400000000000000000000000721503370650000166560ustar00rootroot00000000000000hare::ast provides an abstract syntax tree for Hare code. hare-update-0.25.2.0/v0_25_2/ast/decl.ha000066400000000000000000000041441503370650000172230ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use common; // A constant declaration. // // def foo: int = 0; export type decl_const = struct { ident: ident, _type: nullable *_type, init: *expr, }; // A global declaration. // // let foo: int = 0; // const foo: int = 0; export type decl_global = struct { is_const: bool, is_threadlocal: bool, symbol: str, ident: ident, _type: nullable *_type, init: nullable *expr, }; // A type declaration. // // type foo = int; export type decl_type = struct { ident: ident, _type: *_type, }; // Attributes applicable to a function declaration. export type fndecl_attr = enum { NONE, FINI, INIT, TEST, }; // A function declaration. // // fn main() void = void; export type decl_func = struct { symbol: str, ident: ident, prototype: *_type, body: nullable *expr, attrs: fndecl_attr, }; // A Hare declaration. export type decl = struct { exported: bool, start: common::location, end: common::location, decl: ([]decl_const | []decl_global | []decl_type | decl_func | assert_expr), // Only valid if the lexer has comments enabled docs: str, }; // Frees resources associated with a declaration. 
export fn decl_finish(d: decl) void = { free(d.docs); match (d.decl) { case let g: []decl_global => for (let i = 0z; i < len(g); i += 1) { free(g[i].symbol); ident_free(g[i].ident); type_finish(g[i]._type); free(g[i]._type); expr_finish(g[i].init); free(g[i].init); }; free(g); case let t: []decl_type => for (let i = 0z; i < len(t); i += 1) { ident_free(t[i].ident); type_finish(t[i]._type); free(t[i]._type); }; free(t); case let f: decl_func => free(f.symbol); ident_free(f.ident); type_finish(f.prototype); free(f.prototype); expr_finish(f.body); free(f.body); case let c: []decl_const => for (let i = 0z; i < len(c); i += 1) { ident_free(c[i].ident); type_finish(c[i]._type); free(c[i]._type); expr_finish(c[i].init); free(c[i].init); }; free(c); case let e: assert_expr => expr_finish(e.cond); free(e.cond); expr_finish(e.message); free(e.message); }; }; hare-update-0.25.2.0/v0_25_2/ast/expr.ha000066400000000000000000000334461503370650000173010ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use common; // An identifier access expression. // // foo export type access_identifier = ident; // An index access expression. // // foo[0] export type access_index = struct { object: *expr, index: *expr, }; // A struct field access expression. // // foo.bar export type access_field = struct { object: *expr, field: str, }; // A tuple field access expression. // // foo.1 export type access_tuple = struct { object: *expr, value: *expr, }; // An access expression. export type access_expr = (access_identifier | access_index | access_field | access_tuple); // An align expression. // // align(int) export type align_expr = *_type; // The form of an allocation expression. // // alloc(foo) // OBJECT // alloc(foo...) // COPY export type alloc_form = enum { OBJECT, COPY, }; // An allocation expression. // // alloc(foo) // alloc(foo...) 
// alloc(foo, bar) export type alloc_expr = struct { init: *expr, form: alloc_form, capacity: nullable *expr, }; // An append expression. // // append(foo, bar); // append(foo, bar...); // append(foo, [0...], bar); export type append_expr = struct { object: *expr, value: *expr, length: nullable *expr, variadic: bool, is_static: bool, }; // An assertion expression. // // assert(foo) // assert(foo, "error") // abort() // abort("error") export type assert_expr = struct { cond: nullable *expr, message: nullable *expr, is_static: bool, }; // An assignment expression. // // foo = bar export type assign_expr = struct { op: (binarithm_op | void), object: *expr, value: *expr, }; // A binary arithmetic operator export type binarithm_op = enum { // TODO: Rehome this with the checked AST? BAND, // & BOR, // | DIV, // / GT, // > GTEQ, // >= LAND, // && LEQUAL, // == LESS, // < LESSEQ, // <= LOR, // || LSHIFT, // << LXOR, // ^^ MINUS, // - MODULO, // % NEQUAL, // != PLUS, // + RSHIFT, // >> TIMES, // * BXOR, // ^ }; // A binary arithmetic expression. // // foo * bar export type binarithm_expr = struct { op: binarithm_op, lvalue: *expr, rvalue: *expr, }; // A single variable binding. // // foo: int = bar // (foo, foo2): int = bar export type binding = struct { name: (str | binding_unpack), _type: nullable *_type, init: *expr, }; // Tuple unpacking binding. // // (foo, _, bar) export type binding_unpack = [](str | void); // The kind of binding expression being used. export type binding_kind = enum { CONST, DEF, LET, }; // A variable binding expression. // // let foo: int = bar, ... export type binding_expr = struct { is_static: bool, kind: binding_kind, bindings: []binding, }; // A break expression. The label is set to empty string if absent. // // break :label export type break_expr = label; // A function call expression. // // foo(bar) export type call_expr = struct { lvalue: *expr, variadic: bool, args: []*expr, }; // The kind of cast expression being used. 
export type cast_kind = enum { // TODO: Should this be rehomed with the checked AST? CAST, ASSERTION, TEST, }; // A cast expression. // // foo: int // foo as int // foo is int export type cast_expr = struct { kind: cast_kind, value: *expr, _type: *_type, }; // A compound expression. // // { // foo; // bar; // // ... // } export type compound_expr = struct { exprs: []*expr, label: label, }; // An array literal. // // [foo, bar, ...] export type array_literal = struct { expand: bool, values: []*expr, }; // A single struct field and value. // // foo: int = 10 export type struct_value = struct { name: str, _type: nullable *_type, init: *expr, }; // A struct literal. // // struct { foo: int = bar, struct { baz = quux }, ... } export type struct_literal = struct { autofill: bool, alias: ident, // [] for anonymous fields: [](struct_value | *struct_literal), }; // A tuple literal. // // (foo, bar, ...) export type tuple_literal = []*expr; // The value "null". export type _null = void; // A scalar value. export type value = (bool | done | nomem |_null | str | rune | void); // An integer or float literal. export type number_literal = struct { suff: common::ltok, value: (i64 | u64 | f64), sign: bool, // true if negative, false otherwise }; // A literal expression. export type literal_expr = (value | array_literal | number_literal | struct_literal | tuple_literal); // A continue expression. The label is set to empty string if absent. // // continue :label export type continue_expr = label; // A deferred expression. // // defer foo export type defer_expr = *expr; // A delete expression. // // delete(foo[10]) // delete(foo[4..42]) export type delete_expr = struct { object: *expr, is_static: bool, }; // The kind of for expression being used. export type for_kind = enum { ACCUMULATOR, EACH_VALUE, EACH_POINTER, ITERATOR, }; // A for loop. // // for (let foo = 0; foo < bar; baz) quux // for (let line => next_line()) quux // for (let number .. [1, 2, 3]) quux // for (let ptr &.. 
[1, 2, 3]) quux export type for_expr = struct { kind: for_kind, bindings: nullable *expr, cond: nullable *expr, afterthought: nullable *expr, body: *expr, label: label, }; // A free expression. // // free(foo) export type free_expr = *expr; // An if or if..else expression. // // if (foo) bar else baz export type if_expr = struct { cond: *expr, tbranch: *expr, fbranch: nullable *expr, }; // An insert expression. // // insert(foo[0], bar); // insert(foo[0], bar...); // insert(foo[0], [0...], bar); export type insert_expr = append_expr; // :label. The ":" character is not included. export type label = str; // A length expression. // // len(foo) export type len_expr = *expr; // A match case. // // case type => exprs // case let name: type => exprs export type match_case = struct { name: str, _type: nullable *_type, // null for default case exprs: []*expr, }; // A match expression. // // match (foo) { case int => bar; ... } export type match_expr = struct { value: *expr, cases: []match_case, label: label, }; // An offset expression. // // offset(foo.bar) export type offset_expr = *expr; // An error propagation expression. // // foo? export type propagate_expr = *expr; // An error assertion expression. // // foo! export type error_assert_expr = *expr; // A return statement. // // return foo export type return_expr = nullable *expr; // A size expression. // // size(int) export type size_expr = *_type; // A slicing expression. // // foo[bar..baz] export type slice_expr = struct { object: *expr, start: nullable *expr, end: nullable *expr, }; // A switch case. // // case value => exprs export type switch_case = struct { options: []*expr, // [] for default case exprs: []*expr, }; // A switch expression. // // switch (foo) { case bar => baz; ... } export type switch_expr = struct { value: *expr, cases: []switch_case, label: label, }; // A unary operator export type unarithm_op = enum { // TODO: Should this be rehomed with the checked AST? 
ADDR, // & BNOT, // ~ DEREF, // * LNOT, // ! MINUS, // - }; // A unary arithmetic expression. // // !example export type unarithm_expr = struct { op: unarithm_op, operand: *expr, }; // A vastart expression. // // vastart() export type vastart_expr = void; // A vaarg expression. // // vaarg(ap, int) export type vaarg_expr = struct { ap: *expr, _type: *_type, }; // A vaend expression. // // vaend(ap) export type vaend_expr = *expr; // A C-style variadic expression. export type variadic_expr = (vastart_expr | vaarg_expr | vaend_expr); // A yield expression. // // yield foo export type yield_expr = struct { label: label, value: nullable *expr, }; // A Hare expression. export type expr = struct { start: common::location, end: common::location, expr: (access_expr | align_expr | alloc_expr | append_expr | assert_expr | assign_expr | binarithm_expr | binding_expr | break_expr | call_expr | cast_expr | literal_expr | continue_expr | defer_expr | delete_expr | for_expr | free_expr | error_assert_expr | if_expr | insert_expr | compound_expr | match_expr | len_expr | size_expr | offset_expr | propagate_expr | return_expr | slice_expr | switch_expr | unarithm_expr | variadic_expr | yield_expr), }; // Frees resources associated with a Hare [[expr]]ession. 
export fn expr_finish(e: nullable *expr) void = { match (e) { case null => void; case let e: *expr => match (e.expr) { case let a: access_expr => match (a) { case let i: access_identifier => ident_free(i); case let i: access_index => expr_finish(i.object); free(i.object); expr_finish(i.index); free(i.index); case let f: access_field => expr_finish(f.object); free(f.object); free(f.field); case let t: access_tuple => expr_finish(t.object); free(t.object); expr_finish(t.value); free(t.value); }; case let a: align_expr => type_finish(a); free(a); case let a: alloc_expr => expr_finish(a.init); free(a.init); expr_finish(a.capacity); free(a.capacity); case let a: append_expr => expr_finish(a.object); free(a.object); expr_finish(a.value); free(a.value); expr_finish(a.length); free(a.length); case let a: assert_expr => expr_finish(a.cond); free(a.cond); expr_finish(a.message); free(a.message); case let a: assign_expr => expr_finish(a.object); free(a.object); expr_finish(a.value); free(a.value); case let b: binarithm_expr => expr_finish(b.lvalue); free(b.lvalue); expr_finish(b.rvalue); free(b.rvalue); case let b: binding_expr => for (let i = 0z; i < len(b.bindings); i += 1) { match (b.bindings[i].name) { case let s: str => free(s); case let u: binding_unpack => for (let i = 0z; i < len(u); i += 1) { match (u[i]) { case let s: str => free(s); case => void; }; }; free(u); }; type_finish(b.bindings[i]._type); free(b.bindings[i]._type); expr_finish(b.bindings[i].init); free(b.bindings[i].init); }; free(b.bindings); case let b: break_expr => free(b); case let c: call_expr => expr_finish(c.lvalue); free(c.lvalue); for (let i = 0z; i < len(c.args); i += 1) { expr_finish(c.args[i]); free(c.args[i]); }; free(c.args); case let c: cast_expr => expr_finish(c.value); free(c.value); type_finish(c._type); free(c._type); case let c: compound_expr => for (let i = 0z; i < len(c.exprs); i += 1) { expr_finish(c.exprs[i]); free(c.exprs[i]); }; free(c.exprs); free(c.label); case let c: 
literal_expr => match (c) { case let a: array_literal => for (let i = 0z; i < len(a.values); i += 1) { expr_finish(a.values[i]); free(a.values[i]); }; free(a.values); case let s: struct_literal => struct_literal_finish(&s); case let t: tuple_literal => for (let i = 0z; i < len(t); i += 1) { expr_finish(t[i]); free(t[i]); }; free(t); case (value | number_literal) => void; }; case let c: continue_expr => free(c); case let d: defer_expr => expr_finish(d); free(d); case let d: delete_expr => expr_finish(d.object); free(d.object); case let e: error_assert_expr => expr_finish(e); free(e); case let f: for_expr => expr_finish(f.bindings); free(f.bindings); expr_finish(f.cond); free(f.cond); expr_finish(f.afterthought); free(f.afterthought); expr_finish(f.body); free(f.body); case let f: free_expr => expr_finish(f); free(f); case let i: if_expr => expr_finish(i.cond); free(i.cond); expr_finish(i.tbranch); free(i.tbranch); expr_finish(i.fbranch); free(i.fbranch); case let e: insert_expr => expr_finish(e.object); free(e.object); expr_finish(e.value); free(e.value); expr_finish(e.length); free(e.length); case let l: len_expr => expr_finish(l); free(l); case let m: match_expr => free(m.label); expr_finish(m.value); free(m.value); for (let i = 0z; i < len(m.cases); i += 1) { free(m.cases[i].name); type_finish(m.cases[i]._type); free(m.cases[i]._type); const exprs = m.cases[i].exprs; for (let i = 0z; i < len(exprs); i += 1) { expr_finish(exprs[i]); free(exprs[i]); }; free(exprs); }; free(m.cases); case let o: offset_expr => expr_finish(o); free(o); case let p: propagate_expr => expr_finish(p); free(p); case let r: return_expr => expr_finish(r); free(r); case let s: size_expr => type_finish(s); free(s); case let s: slice_expr => expr_finish(s.object); free(s.object); expr_finish(s.start); free(s.start); expr_finish(s.end); free(s.end); case let s: switch_expr => free(s.label); expr_finish(s.value); free(s.value); for (let i = 0z; i < len(s.cases); i += 1) { let opts = 
s.cases[i].options; for (let j = 0z; j < len(opts); j += 1) { expr_finish(opts[j]); free(opts[j]); }; free(opts); let exprs = s.cases[i].exprs; for (let j = 0z; j < len(exprs); j += 1) { expr_finish(exprs[j]); free(exprs[j]); }; free(exprs); }; free(s.cases); case let u: unarithm_expr => expr_finish(u.operand); free(u.operand); case let v: variadic_expr => match (v) { case vastart_expr => void; case let v: vaarg_expr => expr_finish(v.ap); free(v.ap); type_finish(v._type); free(v._type); case let v: vaend_expr => expr_finish(v); free(v); }; case let y: yield_expr => free(y.label); expr_finish(y.value); free(y.value); }; }; }; fn struct_literal_finish(s: *struct_literal) void = { ident_free(s.alias); for (let i = 0z; i < len(s.fields); i += 1) { match (s.fields[i]) { case let v: struct_value => free(v.name); type_finish(v._type); free(v._type); expr_finish(v.init); free(v.init); case let c: *struct_literal => struct_literal_finish(c); free(c); }; }; free(s.fields); }; hare-update-0.25.2.0/v0_25_2/ast/ident.ha000066400000000000000000000015061503370650000174160ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use strings; // Identifies a single object, e.g. foo::bar::baz. export type ident = []str; // Maximum length of an identifier, as the sum of the lengths of its parts plus // one for each namespace delineation. // // In other words, the length of "a::b::c" is 5. export def IDENT_MAX: size = 255; // Frees resources associated with an [[ident]]ifier. export fn ident_free(ident: ident) void = strings::freeall(ident); // Returns true if two [[ident]]s are identical. export fn ident_eq(a: ident, b: ident) bool = { if (len(a) != len(b)) { return false; }; for (let i = 0z; i < len(a); i += 1) { if (a[i] != b[i]) { return false; }; }; return true; }; // Duplicates an [[ident]]. 
export fn ident_dup(id: ident) ident = strings::dupall(id)!; hare-update-0.25.2.0/v0_25_2/ast/import.ha000066400000000000000000000022071503370650000176240ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use common; use strings; // An imported module. // // use foo; // use foo = bar; // use foo::{bar, baz}; // use foo::*; export type import = struct { start: common::location, end: common::location, ident: ident, bindings: (void | import_alias | import_members | import_wildcard), }; // An import alias. // // use foo = bar; export type import_alias = str; // An import members list. // // use foo::{bar, baz}; export type import_members = []str; // An import wildcard. // // use foo::*; export type import_wildcard = void; // Frees resources associated with an [[import]]. export fn import_finish(import: import) void = { ident_free(import.ident); match (import.bindings) { case let alias: import_alias => free(alias); case let objects: import_members => strings::freeall(objects); case => void; }; }; // Frees resources associated with each [[import]] in a slice, and then // frees the slice itself. export fn imports_finish(imports: []import) void = { for (let i = 0z; i < len(imports); i += 1) { import_finish(imports[i]); }; free(imports); }; hare-update-0.25.2.0/v0_25_2/ast/type.ha000066400000000000000000000111521503370650000172720ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use common; // A type alias. export type alias_type = struct { unwrap: bool, ident: ident, }; // A built-in primitive type (int, bool, str, etc). export type builtin_type = enum { BOOL, DONE, F32, F64, FCONST, I16, I32, I64, I8, ICONST, INT, NEVER, NOMEM, NULL, OPAQUE, RCONST, RUNE, SIZE, STR, U16, U32, U64, U8, UINT, UINTPTR, VALIST, VOID }; // An enumeration field (and optional value). export type enum_field = struct { name: str, value: nullable *expr, loc: common::location, docs: str, }; // enum { FOO = 0, BAR, ... 
} export type enum_type = struct { storage: builtin_type, values: []enum_field, }; // The variadism strategy for a function type. export type variadism = enum { NONE, C, HARE, }; // A parameter to a function type. export type func_param = struct { loc: common::location, name: str, _type: *_type, default_value: (void | expr), }; // fn(foo: int, baz: int...) int export type func_type = struct { result: *_type, variadism: variadism, params: []func_param, }; // The length for a list type which is a slice (e.g. []int). export type len_slice = void; // The length for a list type which is unbounded (e.g. [*]int). export type len_unbounded = void; // The length for a list type which is inferred from context (e.g. [_]int). export type len_contextual = void; // []int, [*]int, [_]int, [foo]int export type list_type = struct { length: (*expr | len_slice | len_unbounded | len_contextual), members: *_type, }; // Flags which apply to a pointer type. export type pointer_flag = enum uint { NONE = 0, NULLABLE = 1 << 0, }; // *int export type pointer_type = struct { referent: *_type, flags: pointer_flag, }; // A single field of a struct type. export type struct_field = struct { name: str, _type: *_type, }; // An embedded struct type. export type struct_embedded = *_type; // An embedded type alias. export type struct_alias = ident; // struct { @offset(10) foo: int, struct { bar: int }, baz::quux } export type struct_member = struct { _offset: nullable *expr, member: (struct_field | struct_embedded | struct_alias), // Only valid if the lexer has comments enabled docs: str, }; // struct { ... } export type struct_type = struct { packed: bool, members: []struct_member, }; // union { ... } export type union_type = []struct_member; export type struct_union_type = (struct_type | union_type); // (int | bool) export type tagged_type = []*_type; // (int, bool, ...) export type tuple_type = []*_type; // Flags which apply to types. 
export type type_flag = enum uint { NONE = 0, CONST = 1 << 0, ERROR = 1 << 1, }; // A Hare type. export type _type = struct { start: common::location, end: common::location, flags: type_flag, repr: (alias_type | builtin_type | enum_type | func_type | list_type | pointer_type | struct_type | union_type | tagged_type | tuple_type), }; fn struct_members_free(membs: []struct_member) void = { for (let i = 0z; i < len(membs); i += 1) { free(membs[i].docs); expr_finish(membs[i]._offset); free(membs[i]._offset); match (membs[i].member) { case let f: struct_field => free(f.name); type_finish(f._type); free(f._type); case let e: struct_embedded => type_finish(e); free(e); case let a: struct_alias => ident_free(a); }; }; free(membs); }; // Frees resources associated with a [[_type]]. export fn type_finish(t: nullable *_type) void = { match (t) { case null => void; case let t: *_type => match (t.repr) { case let a: alias_type => ident_free(a.ident); case builtin_type => void; case let e: enum_type => for (let i = 0z; i < len(e.values); i += 1) { free(e.values[i].name); expr_finish(e.values[i].value); free(e.values[i].value); }; free(e.values); case let f: func_type => type_finish(f.result); free(f.result); for (let i = 0z; i < len(f.params); i += 1) { free(f.params[i].name); type_finish(f.params[i]._type); free(f.params[i]._type); }; free(f.params); case let l: list_type => match (l.length) { case let e: *expr => expr_finish(e); free(e); case => void; }; type_finish(l.members); free(l.members); case let p: pointer_type => type_finish(p.referent); free(p.referent); case let s: struct_type => struct_members_free(s.members); case let t: tagged_type => for (let i = 0z; i < len(t); i += 1) { type_finish(t[i]); free(t[i]); }; free(t); case let t: tuple_type => for (let i = 0z; i < len(t); i += 1) { type_finish(t[i]); free(t[i]); }; free(t); case let u: union_type => struct_members_free(u); }; }; }; 
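The ownership convention throughout this ast module is that each `*_finish` function (`decl_finish`, `expr_finish`, `type_finish`, `ident_free`) releases a value's children, while freeing the value itself is the caller's responsibility; `ident_dup` hands that responsibility to the caller. A minimal, hypothetical usage sketch of the `ident` helpers (the `example` function below is illustrative only and not part of this archive):

```hare
use v0_25_2::ast;

// Illustrative only: demonstrates the dup/eq/free contract of ast::ident.
fn example() void = {
	// An ident is just a []str; the literal below is borrowed, not owned.
	let id: ast::ident = ["foo", "bar"];
	// ident_dup deep-copies every part; the caller owns the result.
	let copy = ast::ident_dup(id);
	defer ast::ident_free(copy); // frees each part, then the slice
	assert(ast::ident_eq(id, copy)); // element-wise string comparison
};
```

Note that `ident_free` must only be called on idents whose parts are heap-allocated (such as the result of `ident_dup` or the parser), never on a borrowed literal like `id` above.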
hare-update-0.25.2.0/v0_25_2/ast/unit.ha000066400000000000000000000006571503370650000173000ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors // A sub-unit, typically representing a single source file. export type subunit = struct { imports: []import, decls: []decl, }; // Frees resources associated with a [[subunit]]. export fn subunit_finish(u: subunit) void = { imports_finish(u.imports); for (let i = 0z; i < len(u.decls); i += 1) { decl_finish(u.decls[i]); }; free(u.decls); }; hare-update-0.25.2.0/v0_25_2/glue.ha000066400000000000000000000053741503370650000164670ustar00rootroot00000000000000use bufio; use common; use common::{nonterminal}; use glue; use v0_25_2::ast; use v0_25_2::lex; use v0_25_2::parse; export const glue = glue::glue { lex_init = &lex_init, lex_free = &lex_free, lex_lex = &lex_lex, lex_unlex = &lex_unlex, lex_save = &lex_save, lex_restore = &lex_restore, lex_mkloc = &lex_mkloc, parse = &do_parse, parse_register_hook = &parse_register_hook, parse_nonterminal = &parse_nonterminal, free_nonterminal = &free_nonterminal, }; fn lex_init(scan: *bufio::scanner, path: str) *glue::lexer = { return alloc(lex::init(scan, path))!: *glue::lexer; }; fn lex_free(lex: *glue::lexer) void = { free(lex); }; fn lex_lex(lex: *glue::lexer) (common::token | common::error) = { const lexer = lex: *lex::lexer; return lex::lex(lexer)?; }; fn lex_unlex(lex: *glue::lexer, tok: common::token) void = { const lexer = lex: *lex::lexer; lex::unlex(lexer, tok); }; fn lex_save(lex: *glue::lexer) *glue::restore_point = { const lexer = lex: *lex::lexer; return alloc(lex::save(lexer)!)!: *glue::restore_point; }; fn lex_restore(lex: *glue::lexer, rp: *glue::restore_point) void = { const lexer = lex: *lex::lexer; const rp = rp: *lex::restore_point; lex::restore(lexer, rp)!; free(rp); }; fn lex_mkloc(lex: *glue::lexer) common::location = { return lex::mkloc(lex: *lex::lexer); }; fn do_parse(lex: *glue::lexer) (void | common::error) = { const lex = lex: 
*lex::lexer; match (parse::subunit(lex)) { case let su: ast::subunit => ast::subunit_finish(su); case let err: parse::error => return err; }; }; fn parse_register_hook( target: nonterminal, func: *opaque, user: nullable *opaque, ) void = { parse::register_hook(target, func: *parse::hookfunc, user); }; fn parse_nonterminal( lex: *glue::lexer, kind: nonterminal, ) (nullable *opaque | common::error) = { const lex = lex: *lex::lexer; switch (kind) { case nonterminal::ALLOC_EXPRESSION => return alloc(parse::expr(lex)?)!; case nonterminal::CALL_EXPRESSION => return alloc(parse::expr(lex)?)!; case nonterminal::EXPRESSION => return alloc(parse::expr(lex)?)!; case nonterminal::IDENTIFIER => return alloc(parse::ident(lex)?)!; case nonterminal::LOCATION => return null; case nonterminal::BALANCED => lex::skip_balanced(lex)?; return null; case => abort(); }; }; fn free_nonterminal( kind: nonterminal, data: nullable *opaque, ) void = { switch (kind) { case nonterminal::ALLOC_EXPRESSION => ast::expr_finish(data: *ast::expr); case nonterminal::CALL_EXPRESSION => ast::expr_finish(data: *ast::expr); case nonterminal::EXPRESSION => ast::expr_finish(data: *ast::expr); case nonterminal::IDENTIFIER => ast::ident_free(*(data: *ast::ident)); case nonterminal::LOCATION => return; case nonterminal::BALANCED => return; case => abort(); }; free(data); }; hare-update-0.25.2.0/v0_25_2/lex/000077500000000000000000000000001503370650000160005ustar00rootroot00000000000000hare-update-0.25.2.0/v0_25_2/lex/README000066400000000000000000000003131503370650000166550ustar00rootroot00000000000000hare::lex provides a lexer for Hare source code. A lexer takes an [[io::handle]] and returns a series of Hare [[token]]s. 
See the Hare specification for more details: https://harelang.org/specification hare-update-0.25.2.0/v0_25_2/lex/lex.ha000066400000000000000000000561211503370650000171070ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use ascii; use bufio; use common::*; use encoding::utf8; use fmt; use io; use memio; use path; use sort; use sort::cmp; use strconv; use strings; use types; export type lexer = struct { in: *bufio::scanner, path: str, loc: (uint, uint, io::off), prevrloc: (uint, uint, io::off), un: token, // ltok::EOF when no token was unlexed prevunlocs: [2]((uint, uint, io::off), (uint, uint, io::off)), flags: flag, comment: str, require_int: bool, }; // Flags which apply to this lexer export type flag = enum uint { NONE = 0, // Enables lexing comments COMMENTS = 1 << 0, }; // Initializes a new lexer for the given [[bufio::scanner]]. The path is // borrowed. export fn init( in: *bufio::scanner, path: str, flags: flag = flag::NONE, ) lexer = { const loc = location { path = path, line = 1, col = 1, off = 0, }; return lexer { in = in, path = path, loc = (1, 1, 0), prevrloc = (1, 1, 0), un = (ltok::EOF, void, loc), prevunlocs = [((1, 1, 0), (1, 1, 0))...], flags = flags, ... }; }; export type restore_point = struct { off: io::off, state: lexer, }; // Saves the state of a [[lexer]], to be restored later with [[restore]]. The // underlying I/O source must be seekable. export fn save(lex: *lexer) (restore_point | io::error) = { return restore_point { off = io::tell(lex.in)?, state = *lex, }; }; // Restores a lexer to a state previously recorded with [[save]]. export fn restore(lex: *lexer, rp: *restore_point) (void | io::error) = { io::seek(lex.in, rp.off, io::whence::SET)?; *lex = rp.state; }; // Returns the current value of the comment buffer, or empty string if unset (or // if [[flag::COMMENTS]] was not enabled for this lexer). export fn comment(lex: *lexer) str = lex.comment; // Returns the next token from the lexer. 
export fn lex(lex: *lexer) (token | error) = { if (lex.un.0 != ltok::EOF) { defer lex.un.0 = ltok::EOF; return lex.un; }; defer { lex.prevunlocs[1] = lex.prevunlocs[0]; const prev = prevloc(lex); const loc = mkloc(lex); lex.prevunlocs[0] = ( (prev.line, prev.col, prev.off), (loc.line, loc.col, loc.off), ); }; let r: (rune, location) = ('\0', location { ... }); for (true) { r = match (nextw(lex)?) { case io::EOF => return (ltok::EOF, void, mkloc(lex)); case let r: (rune, location) => yield r; }; if (r.0 == '#') { lex_annotation(lex)?; } else { break; }; }; if (ascii::isdigit(r.0)) { unget(lex, r.0); return lex_literal(lex); }; lex.require_int = false; if (is_name(r.0, false)) { unget(lex, r.0); return lex_name(lex, r.1); }; let tok = switch (r.0) { case '"', '\'', '`' => unget(lex, r.0); return lex_rn_str(lex); case '.', '<', '>', '&', '|', '^' => unget(lex, r.0); return lex3(lex); case '*', '%', '/', '+', '-', ':', '!', '=' => unget(lex, r.0); return lex2(lex); case '~' => yield ltok::BNOT; case ',' => yield ltok::COMMA; case '{' => yield ltok::LBRACE; case '[' => yield ltok::LBRACKET; case '(' => yield ltok::LPAREN; case '}' => yield ltok::RBRACE; case ']' => yield ltok::RBRACKET; case ')' => yield ltok::RPAREN; case ';' => yield ltok::SEMICOLON; case '?' => yield ltok::QUESTION; case '$' => yield ltok::DOLLAR; case => return syntaxerr(r.1, "invalid character"); }; line_comment(lex)?; return (tok, void, r.1); }; fn is_name(r: rune, num: bool) bool = ascii::isalpha(r) || r == '_' || r == '@' || (num && ascii::isdigit(r)); fn lex_annotation(l: *lexer) (void | error) = { static let in_annotation = false; if (in_annotation) { return syntaxerr(mkloc(l), "cannot nest annotations"); }; in_annotation = true; defer in_annotation = false; match (next(l)?) 
{ case let n: (rune, location) => const (r, l) = n; if (r != '[') { return syntaxerr(l, "invalid annotation; expected #[...]"); }; case io::EOF => return syntaxerr(mkloc(l), "invalid annotation; expected #[...]"); }; let id: []str = []; defer strings::freeall(id); let tokens = false; for (true) { let tok = lex(l)?; if (tok.0 != ltok::NAME) { return syntaxerr(tok.2, "invalid annotation, expected identifier"); }; append(id, tok.1 as str)!; tok = lex(l)?; switch (tok.0) { case ltok::LPAREN, ltok::RBRACKET => unlex(l, tok); break; case ltok::DOUBLE_COLON => yield; case => return syntaxerr(tok.2, "invalid annotation, expected identifier"); }; }; let tok = lex(l)?; switch (tok.0) { case ltok::RBRACKET => return; case ltok::LPAREN => yield; case => return syntaxerr(tok.2, "invalid annotation (expected '(' or ']')"); }; def STACKSZ: int = 32; let stack: [STACKSZ]ltok = [ltok::EOF...]; let stack = stack[..0]; let sp = 0; static append(stack, ltok::LPAREN)!; for (sp >= 0) { if (sp + 1 >= STACKSZ) { return syntaxerr(tok.2, "annotation depth exceeds token stack limit"); }; tok = lex(l)?; let want = ltok::EOF; switch (tok.0) { case ltok::LPAREN, ltok::LBRACKET, ltok::LBRACE => static append(stack, tok.0)!; sp += 1; case ltok::RPAREN => want = ltok::LPAREN; case ltok::RBRACKET => want = ltok::LBRACKET; case ltok::RBRACE => want = ltok::LBRACE; case => void; }; if (want != ltok::EOF) { let have = stack[sp]; static delete(stack[sp]); sp -= 1; if (have != want) { return syntaxerr(tok.2, "unbalanced tokens in annotation"); }; }; }; tok = lex(l)?; if (tok.0 != ltok::RBRACKET) { return syntaxerr(tok.2, "invalid annotation (expected ']')"); }; }; fn lex_unicode(lex: *lexer, loc: location, n: size) (rune | error) = { assert(n < 9); let buf: [8]u8 = [0...]; for (let i = 0z; i < n; i += 1z) { let r = match (next(lex)?) 
{ case io::EOF => return syntaxerr(loc, "unexpected EOF scanning for escape"); case let r: (rune, location) => yield r.0; }; if (!ascii::isxdigit(r)) { return syntaxerr(loc, "unexpected rune scanning for escape"); }; buf[i] = r: u8; }; let s = strings::fromutf8_unsafe(buf[..n]); return strconv::stou32(s, strconv::base::HEX) as u32: rune; }; fn lex_rune(lex: *lexer, loc: location) (rune | error) = { let r = match (next(lex)?) { case io::EOF => return syntaxerr(loc, "unexpected EOF scanning for rune"); case let r: (rune, location) => yield r.0; }; if (r != '\\') { return r; }; r = match (next(lex)?) { case io::EOF => return syntaxerr(loc, "unexpected EOF scanning for escape"); case let r: (rune, location) => yield r.0; }; switch (r) { case '\\' => return '\\'; case '\'' => return '\''; case '0' => return '\0'; case 'a' => return '\a'; case 'b' => return '\b'; case 'f' => return '\f'; case 'n' => return '\n'; case 'r' => return '\r'; case 't' => return '\t'; case 'v' => return '\v'; case '"' => return '\"'; case 'x' => return lex_unicode(lex, loc, 2); case 'u' => return lex_unicode(lex, loc, 4); case 'U' => return lex_unicode(lex, loc, 8); case => return syntaxerr(mkloc(lex), "unknown escape sequence"); }; }; fn lex_string(lex: *lexer, loc: location, delim: rune) (token | error) = { let ret: token = (ltok::LIT_STR, "", loc); let buf = memio::dynamic(); for (true) match (next(lex)?) { case io::EOF => return syntaxerr(loc, "unexpected EOF scanning string literal"); case let r: (rune, location) => if (r.0 == delim) break else if (delim == '"' && r.0 == '\\') { unget(lex, r.0); let r = lex_rune(lex, loc)?; memio::appendrune(&buf, r)?; } else { memio::appendrune(&buf, r.0)?; }; }; for (true) match (nextw(lex)?) { case io::EOF => break; case let r: (rune, location) => switch (r.0) { case '"', '`' => const tok = lex_string(lex, loc, r.0)?; const next = tok.1 as str; memio::concat(&buf, next)!; free(next); break; case '/' => match (nextw(lex)?) 
{ case io::EOF => unget(lex, r.0); case let s: (rune, location) => if (s.0 == '/') { lex_comment(lex)?; continue; } else { unget(lex, s.0); unget(lex, r.0); }; }; break; case => unget(lex, r.0); break; }; }; return (ltok::LIT_STR, memio::string(&buf)!, loc); }; fn lex_rn_str(lex: *lexer) (token | error) = { const loc = mkloc(lex); let r = match (next(lex)) { case let r: (rune, location) => yield r.0; case (io::EOF | io::error) => abort(); }; switch (r) { case '\'' => void; case '\"', '`' => return lex_string(lex, loc, r); case => abort(); // Invariant }; // Rune literal let ret: token = (ltok::LIT_RCONST, lex_rune(lex, loc)?, loc); match (next(lex)?) { case io::EOF => return syntaxerr(loc, "unexpected EOF"); case let n: (rune, location) => if (n.0 != '\'') { return syntaxerr(n.1, "expected \"\'\""); }; }; line_comment(lex)?; return ret; }; fn lex_name(lex: *lexer, loc: location) (token | error) = { let buf = memio::dynamic(); match (next(lex)) { case let r: (rune, location) => assert(is_name(r.0, false)); memio::appendrune(&buf, r.0)!; case (io::EOF | io::error) => abort(); }; for (true) match (next(lex)?) { case io::EOF => break; case let r: (rune, location) => if (!is_name(r.0, true)) { unget(lex, r.0); break; }; memio::appendrune(&buf, r.0)?; }; line_comment(lex)?; let n = memio::string(&buf)!; match (sort::search(bmap[..ltok::LAST_KEYWORD+1], size(str), &n, &cmp::strs)) { case void => return (ltok::NAME, n, loc); case let i: size => free(n); return (i: ltok, void, loc); }; }; fn line_comment(lex: *lexer) (void | error) = { if (lex.flags & flag::COMMENTS != flag::COMMENTS) { return; }; let r: (rune, location) = ('\0', location { ... }); for (true) match (try(lex, '\t', ' ', '/')?) { case void => return; case let v: (rune, location) => switch (v.0) { case '\t', ' ' => void; case '/' => r = v; break; case => abort(); // unreachable }; }; if (try(lex, '/')? 
is void) { unget(lex, r.0); return; }; free(lex.comment); lex.comment = ""; lex_comment(lex)?; }; fn lex_comment(lexr: *lexer) (void | error) = { if (lexr.flags & flag::COMMENTS != flag::COMMENTS) { for (true) match (next(lexr)?) { case io::EOF => break; case let r: (rune, location) => if (r.0 == '\n') { break; }; }; return; }; let buf = memio::dynamic(); defer io::close(&buf)!; for (true) match (next(lexr)?) { case io::EOF => break; case let r: (rune, location) => memio::appendrune(&buf, r.0)!; if (r.0 == '\n') { break; }; }; let bytes = strings::toutf8(lexr.comment); append(bytes, strings::toutf8(memio::string(&buf)!)...)!; lexr.comment = strings::fromutf8(bytes)!; }; fn lex_literal(lex: *lexer) (token | error) = { const loc = mkloc(lex); let chars: []u8 = []; let r = match (next(lex)?) { case io::EOF => return (ltok::EOF, void, loc); case let r: (rune, location) => yield r; }; let started = false; let base = strconv::base::DEC; if (r.0 == '0') { append(chars, utf8::encoderune(r.0)...)!; r = match (next(lex)?) { case io::EOF => return (ltok::LIT_ICONST, 0u64, loc); case let r: (rune, location) => yield r; }; switch (r.0) { case 'b' => base = strconv::base::BIN; case 'o' => base = strconv::base::OCT; case 'x' => base = strconv::base::HEX; case => if (ascii::isdigit(r.0)) { return syntaxerr(loc, "Leading zeros in number literals aren't permitted (for octal, use the 0o prefix instead)"); }; started = true; unget(lex, r.0); }; } else unget(lex, r.0); let basechrs = switch (base) { case strconv::base::BIN => yield "01"; case strconv::base::OCT => yield "01234567"; case strconv::base::DEC => yield "0123456789"; case strconv::base::HEX => yield "0123456789ABCDEFabcdef"; case => abort(); // unreachable }; let suff: (size | void) = void; let exp: (size | void) = void; let end = 0z; let float = false; let last_rune_was_separator = false; for (true) { r = match (next(lex)?) 
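// This loop accumulates the rest of the literal: digits in the current
// base, at most one '.', an exponent introduced by e/E (decimal only) or
// p/P (non-decimal bases), a type suffix (i, u, z, i8-i64, u8-u64, f32,
// f64), and '_' digit separators, which may not appear in exponents or
// suffixes.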
{ case io::EOF => if (last_rune_was_separator) { return syntaxerr(loc, "Expected digit after separator"); }; break; case let r: (rune, location) => yield r; }; if (!strings::contains(basechrs, r.0)) { if (last_rune_was_separator) { return syntaxerr(loc, "Expected digit after separator"); }; switch (r.0) { case '.' => if (!started) { return syntaxerr(loc, "Expected integer literal"); }; if (float || exp is size || suff is size || lex.require_int) { unget(lex, r.0); break; } else { r = match (next(lex)?) { case io::EOF => break; case let r: (rune, location) => yield r; }; if (!strings::contains(basechrs, r.0)) { unget(lex, r.0); unget(lex, '.'); break; }; unget(lex, r.0); float = true; append(chars, utf8::encoderune('.')...)!; }; case 'e', 'E', 'p', 'P' => if (!started) { return syntaxerr(loc, "Expected integer literal"); }; if ((r.0 == 'e' || r.0 == 'E') != (base == strconv::base::DEC)) { unget(lex, r.0); break; }; if (exp is size || suff is size) { unget(lex, r.0); break; } else { if (end == 0) end = len(chars); append(chars, utf8::encoderune(r.0)...)!; exp = len(chars); r = match (next(lex)?) 
{ case io::EOF => break; case let r: (rune, location) => yield r; }; switch (r.0) { case '+', '-' => append(chars, utf8::encoderune(r.0)...)!; case => unget(lex, r.0); }; basechrs = "0123456789"; }; case 'i', 'u', 'f', 'z' => if (!started) { return syntaxerr(loc, "Expected integer literal"); }; if (suff is size || r.0 != 'f' && float || r.0 == 'f' && base != strconv::base::DEC) { unget(lex, r.0); break; } else { suff = len(chars); if (end == 0) end = len(chars); append(chars, utf8::encoderune(r.0)...)!; basechrs = "0123456789"; }; case '_' => if (!started) { return syntaxerr(loc, "Expected integer literal"); }; if (exp is size) { return syntaxerr(loc, "Exponents may not contain separators"); }; if (suff is size) { return syntaxerr(loc, "Suffixes may not contain separators"); }; last_rune_was_separator = true; case => unget(lex, r.0); break; }; } else { last_rune_was_separator = false; append(chars, utf8::encoderune(r.0)...)!; }; started = true; }; if (!started) { return syntaxerr(loc, "expected integer literal"); }; if (end == 0) end = len(chars); lex.require_int = false; let exp = match (exp) { case void => yield "0"; case let exp: size => let end = match (suff) { case void => yield len(chars); case let suff: size => yield suff; }; yield strings::fromutf8(chars[exp..end])!; }; let exp = match (strconv::stoi(exp)) { case let exp: int => yield exp; case strconv::invalid => return syntaxerr(mkloc(lex), "expected exponent"); case strconv::overflow => return syntaxerr(loc, "overflow in exponent"); }; let floatend = match (suff) { case let suff: size => yield suff; case void => yield len(chars); }; let suff = match (suff) { case let suff: size => yield strings::fromutf8(chars[suff..])!; case void => yield ""; }; let (suff, signed) = if (suff == "u8") (ltok::LIT_U8, false) else if (suff == "u16") (ltok::LIT_U16, false) else if (suff == "u32") (ltok::LIT_U32, false) else if (suff == "u64") (ltok::LIT_U64, false) else if (suff == "u") (ltok::LIT_UINT, false) else if (suff 
== "z") (ltok::LIT_SIZE, false) else if (suff == "i8") (ltok::LIT_I8, true) else if (suff == "i16") (ltok::LIT_I16, true) else if (suff == "i32") (ltok::LIT_I32, true) else if (suff == "i64") (ltok::LIT_I64, true) else if (suff == "i") (ltok::LIT_INT, true) else if (suff == "" && !float && exp >= 0) (ltok::LIT_ICONST, false) else if (suff == "f32") (ltok::LIT_F32, false) else if (suff == "f64") (ltok::LIT_F64, false) else if (suff == "" && (float || exp < 0)) (ltok::LIT_FCONST, false) else return syntaxerr(loc, "invalid literal suffix"); let exp = if (exp < 0) switch (suff) { case ltok::LIT_F32, ltok::LIT_F64, ltok::LIT_FCONST => yield exp: size; case => return syntaxerr(loc, "invalid negative exponent of integer"); } else exp: size; let val = strings::fromutf8(chars[..end])!; let val = switch (suff) { case ltok::LIT_F32, ltok::LIT_F64, ltok::LIT_FCONST => val = strings::fromutf8(chars[..floatend])!; yield strconv::stof64(val, base); case => yield strconv::stou64(val, base); }; let val = match (val) { case let val: u64 => for (let i = 0z; i < exp; i += 1) { let old = val; val *= 10; if (val / 10 != old) { return syntaxerr(loc, "overflow in exponent"); }; }; if (signed && val > types::I64_MIN: u64) { return syntaxerr(loc, "overflow in exponent"); }; yield val; case let val: f64 => yield val; case strconv::invalid => abort(); // Shouldn't be lexed in case strconv::overflow => return syntaxerr(loc, "literal overflow"); }; line_comment(lex)?; return (suff, val, loc); }; fn lex2(lexr: *lexer) (token | error) = { let first = next(lexr)? as (rune, location); let tok: (ltok, [](rune, ltok)) = switch (first.0) { case '*' => yield (ltok::TIMES, [('=', ltok::TIMESEQ)]); case '%' => yield (ltok::MODULO, [('=', ltok::MODEQ)]); case '/' => match (next(lexr)?) 
{ case let r: (rune, location) => switch (r.0) { case '=' => line_comment(lexr)?; return (ltok::DIVEQ, void, first.1); case '/' => lex_comment(lexr)?; return lex(lexr); case => unget(lexr, r.0); return (ltok::DIV, void, first.1); }; case io::EOF => return (ltok::DIV, void, first.1); }; case '+' => yield (ltok::PLUS, [('=', ltok::PLUSEQ)]); case '-' => yield (ltok::MINUS, [('=', ltok::MINUSEQ)]); case ':' => yield (ltok::COLON, [(':', ltok::DOUBLE_COLON)]); case '!' => yield (ltok::LNOT, [('=', ltok::NEQUAL)]); case '=' => yield (ltok::EQUAL, [('=', ltok::LEQUAL), ('>', ltok::ARROW)]); case => return syntaxerr(first.1, "unknown token sequence"); }; match (next(lexr)?) { case let r: (rune, location) => for (let i = 0z; i < len(tok.1); i += 1) { if (tok.1[i].0 == r.0) { line_comment(lexr)?; return (tok.1[i].1, void, first.1); }; }; unget(lexr, r.0); line_comment(lexr)?; case io::EOF => void; }; return (tok.0, void, first.1); }; fn lex3(lex: *lexer) (token | error) = { let r = next(lex)? as (rune, location); let toks = switch (r.0) { case '.' => let tok = if (try(lex, '.')? is void) { lex.require_int = true; yield ltok::DOT; } else if (try(lex, '.')? is void) { yield ltok::DOUBLE_DOT; } else ltok::ELLIPSIS; line_comment(lex)?; return (tok, void, r.1); case '<' => yield [ltok::LESS, ltok::LESSEQ, ltok::LSHIFT, ltok::LSHIFTEQ]; case '>' => yield [ltok::GT, ltok::GTEQ, ltok::RSHIFT, ltok::RSHIFTEQ]; case '&' => yield [ltok::BAND, ltok::BANDEQ, ltok::LAND, ltok::LANDEQ]; case '|' => yield [ltok::BOR, ltok::BOREQ, ltok::LOR, ltok::LOREQ]; case '^' => yield [ltok::BXOR, ltok::BXOREQ, ltok::LXOR, ltok::LXOREQ]; case => return syntaxerr(r.1, "unknown token sequence"); }; let idx = match (try(lex, r.0, '=')?) { case void => yield 0; // X case let n: (rune, location) => yield switch (n.0) { case '=' => yield 1; // X= case => yield match (try(lex, '=')?) 
{ case void => yield 2; // XX case (rune, location) => yield 3; // XX= }; }; }; line_comment(lex)?; return (toks[idx], void, r.1); }; // Unlex a single token. The next call to [[lex]] will return this token. Only one // unlex is supported at a time; you must call [[lex]] before calling [[unlex]] // again. export fn unlex(lex: *lexer, tok: token) void = { assert(lex.un.0 == ltok::EOF, "attempted to unlex more than one token"); lex.un = tok; }; // Lexes a sequence of "balanced" tokens and discards them. export fn skip_balanced(l: *lexer) (void | error) = { def STACKSZ: int = 256; let stack: [STACKSZ]ltok = [ltok::EOF...]; let stack = stack[..0]; let sp = 0; let tok = lex(l)?; switch (tok.0) { case ltok::LPAREN, ltok::LBRACKET, ltok::LBRACE => static append(stack, tok.0)!; case => return syntaxerr(tok.2, "Expected a 'balanced' token"); }; for (sp >= 0) { if (sp + 1 >= STACKSZ) { return syntaxerr(tok.2, "depth exceeds token stack limit"); }; tok = lex(l)?; let want = ltok::EOF; switch (tok.0) { case ltok::LPAREN, ltok::LBRACKET, ltok::LBRACE => static append(stack, tok.0)!; sp += 1; case ltok::RPAREN => want = ltok::LPAREN; case ltok::RBRACKET => want = ltok::LBRACKET; case ltok::RBRACE => want = ltok::LBRACE; case => void; }; if (want != ltok::EOF) { let have = stack[sp]; static delete(stack[sp]); sp -= 1; if (have != want) { return syntaxerr(tok.2, "unbalanced tokens"); }; }; }; }; fn next(lex: *lexer) ((rune, location) | syntax | io::EOF | io::error) = { match (bufio::scan_rune(lex.in)) { case let e: (io::EOF | io::error) => return e; case let r: rune => const loc = mkloc(lex); lexloc(lex, r); return (r, loc); case utf8::invalid => return syntaxerr(mkloc(lex), "Source file is not valid UTF-8"); }; }; fn nextw(lex: *lexer) ((rune, location) | io::EOF | error) = { for (true) match (next(lex)?) 
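// nextw returns the next rune, skipping whitespace. A newline, or any rune
// that can neither continue a name nor begin a comment, also discards the
// doc comment accumulated so far.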
{ case io::EOF => return io::EOF; case let r: (rune, location) => if (ascii::isspace(r.0)) { if (r.0 == '\n') { free(lex.comment); lex.comment = ""; }; continue; }; if (!is_name(r.0, true) && r.0 != '/') { free(lex.comment); lex.comment = ""; }; return r; }; }; fn try( lex: *lexer, want: rune... ) ((rune, location) | syntax | void | io::error) = { let r = match (next(lex)?) { case io::EOF => return; case let r: (rune, location) => yield r; }; assert(len(want) > 0); for (let i = 0z; i < len(want); i += 1) { if (r.0 == want[i]) { return r; }; }; unget(lex, r.0); }; fn unget(lex: *lexer, r: rune) void = { bufio::unreadrune(lex.in, r); // here, we set the current location to the previous location, then // subtract one from the previous location's column. this is always // correct, even for tabs and newlines, since a tab or newline will // never be ungot after a previous unget call. besides tabs and // newlines, the rune will always be a printable ASCII character assert(ascii::isprint(r) || r == '\t' || r == '\n'); assert(r != '\n' || lex.prevrloc.0 == lex.loc.0 - 1); lex.loc = lex.prevrloc; lex.prevrloc.1 -= 1; lex.prevrloc.2 -= 1; }; fn lexloc(lex: *lexer, r: rune) void = { lex.prevrloc = lex.loc; lex.loc.2 += 1; switch (r) { case '\n' => lex.loc.0 += 1; lex.loc.1 = 1; case '\t' => lex.loc.1 += 8 - lex.loc.1 % 8 + 1; case => lex.loc.1 += 1; }; }; export fn mkloc(lex: *lexer) location = { const loc = if (lex.un.0 == ltok::EOF) lex.loc else ( lex.un.2.line, lex.un.2.col, lex.un.2.off, ); return location { path = lex.path, line = loc.0, col = loc.1, off = loc.2, }; }; export fn prevloc(lex: *lexer) location = { const loc = if (lex.un.0 == ltok::EOF) lex.prevrloc else lex.prevunlocs[1].0; return location { path = lex.path, line = loc.0, col = loc.1, off = loc.2, }; }; export fn syntaxerr(loc: location, why: str) error = { static let buf = path::buffer{...}; path::set(&buf, loc.path)!; loc.path = path::string(&buf); return (loc, why); }; 
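The [[skip_balanced]] routine above tracks delimiter nesting with a fixed-size token stack, pushing on opening tokens and popping on matching closers. A minimal sketch of the same balancing logic, in Python rather than Hare (the function name and the character-based token stream are illustrative assumptions, not part of this module):

```python
# Sketch of the balanced-token skip used by skip_balanced: push opening
# delimiters, pop on closers, fail on a mismatch or when the nesting
# depth exceeds a fixed limit (mirroring syntaxerr in the Hare code).
PAIRS = {")": "(", "]": "[", "}": "{"}
STACKSZ = 256

def skip_balanced(tokens):
    """Consume tokens until the first delimiter is rebalanced.

    Returns the number of tokens consumed, or raises ValueError on
    unbalanced input.
    """
    if not tokens or tokens[0] not in "([{":
        raise ValueError("Expected a 'balanced' token")
    stack = [tokens[0]]
    consumed = 1
    for tok in tokens[1:]:
        consumed += 1
        if tok in "([{":
            if len(stack) + 1 >= STACKSZ:
                raise ValueError("depth exceeds token stack limit")
            stack.append(tok)
        elif tok in PAIRS:
            # Closing token: the top of the stack must match its opener.
            if stack.pop() != PAIRS[tok]:
                raise ValueError("unbalanced tokens")
            if not stack:
                return consumed
    raise ValueError("unbalanced tokens")
```

The Hare version uses a statically-sized array with `static append`/`static delete` and an `sp` index, so skipping a balanced group never allocates.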
hare-update-0.25.2.0/v0_25_2/parse/
hare-update-0.25.2.0/v0_25_2/parse/README

hare::parse provides a parser for Hare source code. The [[subunit]] function
will parse a single Hare source file and return a [[hare::ast::subunit]].
Other functions provide parsers for various important Hare sub-terminals,
such as [[decls]] and [[imports]]. See the Hare specification for more
details:

https://harelang.org/specification

Most of these functions require the caller to provide a Hare lexer, see
[[hare::lex::]] for details.

hare-update-0.25.2.0/v0_25_2/parse/decl.ha

// SPDX-License-Identifier: MPL-2.0
// (c) Hare authors

use ascii;
use common;
use common::{ltok, nonterminal};
use strings;
use v0_25_2::ast;
use v0_25_2::lex;

fn attr_symbol(lexer: *lex::lexer) (str | error) = {
	want(lexer, ltok::LPAREN)?;
	let t = want(lexer, ltok::LIT_STR)?;
	let s = t.1 as str;
	let d = strings::iter(s);
	match (strings::next(&d)) {
	case done => void;
	case let r: rune =>
		synassert(t.2, ascii::isalpha(r) || r == '.' || r == '_',
			"Invalid symbol")?;
	};
	for (let r => strings::next(&d)) {
		synassert(t.2, ascii::isalnum(r) || r == '$' || r == '.'
			|| r == '_', "Invalid symbol")?;
	};
	want(lexer, ltok::RPAREN)?;
	return s;
};

// Parses a command-line definition
export fn define(lexer: *lex::lexer) (ast::decl_const | error) = {
	const ident = ident(lexer)?;
	const _type: nullable *ast::_type = match (try(lexer, ltok::COLON)?)
{ case common::token => yield alloc(_type(lexer)?)!; case void => yield null; }; want(lexer, ltok::EQUAL)?; const init: *ast::expr = alloc(expr(lexer)?)!; return ast::decl_const { ident = ident, _type = _type, init = init, }; }; fn decl_const( lexer: *lex::lexer, tok: ltok, ) ([]ast::decl_const | error) = { let decl: []ast::decl_const = []; for (true) { append(decl, define(lexer)?)!; if (try(lexer, ltok::COMMA)? is void) { break; }; }; return decl; }; fn decl_global( lexer: *lex::lexer, tok: ltok, ) ([]ast::decl_global | error) = { let decl: []ast::decl_global = []; for (true) { const (symbol, threadlocal) = match (try(lexer, ltok::ATTR_SYMBOL, ltok::ATTR_THREADLOCAL)?) { case void => yield ("", false); case let t: common::token => yield if (t.0 == ltok::ATTR_SYMBOL) { yield (attr_symbol(lexer)?, false); } else { yield ("", true); }; }; const ident = ident(lexer)?; const _type: nullable *ast::_type = match (try(lexer, ltok::COLON)?) { case common::token => yield alloc(_type(lexer)?)!; case void => yield null; }; const init: nullable *ast::expr = match (try(lexer, ltok::EQUAL)?) 
{ case common::token => yield alloc(expr(lexer)?)!; case void => yield null; }; const btok = try(lexer, ltok::COMMA)?; append(decl, ast::decl_global { is_const = tok == ltok::CONST, is_threadlocal = threadlocal, symbol = symbol, ident = ident, _type = _type, init = init, })!; if (btok is void) { break; }; }; return decl; }; fn decl_type(lexer: *lex::lexer) ([]ast::decl_type | error) = { let decl: []ast::decl_type = []; for (true) { let ident = ident(lexer)?; want(lexer, ltok::EQUAL)?; let _type = _type(lexer)?; let btok = try(lexer, ltok::COMMA)?; append(decl, ast::decl_type { ident = ident, _type = alloc(_type)!, })!; if (btok is void) { break; }; }; return decl; }; fn decl_func(lexer: *lex::lexer) (ast::decl_func | error) = { on(lexer, nonterminal::FUNCTION_DECLARATION, null)?; let attr = ast::fndecl_attr::NONE, sym = ""; const attrs = [ ltok::ATTR_FINI, ltok::ATTR_INIT, ltok::ATTR_TEST, ltok::ATTR_SYMBOL ]; for (true) match (try(lexer, attrs...)?) { case void => break; case let t: common::token => synassert(t.2, t.0 == ltok::ATTR_SYMBOL || attr == 0, "Only one of @init, @fini, or @test may be provided")?; switch (t.0) { case ltok::ATTR_FINI => attr = ast::fndecl_attr::FINI; case ltok::ATTR_INIT => attr = ast::fndecl_attr::INIT; case ltok::ATTR_TEST => attr = ast::fndecl_attr::TEST; case ltok::ATTR_SYMBOL => sym = attr_symbol(lexer)?; case => abort("unreachable"); }; }; want(lexer, ltok::FN)?; let ident_loc = lex::mkloc(lexer); let ident = ident(lexer)?; let proto_start = lex::mkloc(lexer); let prototype = prototype(lexer)?; let proto_end = lex::prevloc(lexer); let tok = want(lexer, ltok::EQUAL, ltok::SEMICOLON)?; let body = switch (tok.0) { case ltok::EQUAL => for (let param &.. 
prototype.params) { synassert(param.loc, len(param.name) > 0, "Expected parameter name in function declaration")?; }; yield alloc(expr(lexer)?)!; case ltok::SEMICOLON => lex::unlex(lexer, tok); yield null; case => abort(); // unreachable }; return ast::decl_func { symbol = sym, ident = ident, prototype = alloc(ast::_type { start = proto_start, end = proto_end, flags = 0, repr = prototype, })!, body = body, attrs = attr, }; }; // Parses a declaration. export fn decl(lexer: *lex::lexer) (ast::decl | error) = { const start = lex::mkloc(lexer); let comment = ""; if (try(lexer, ltok::STATIC)? is common::token) { comment = strings::dup(lex::comment(lexer))!; let expr = assert_expr(lexer, true)?; want(lexer, ltok::SEMICOLON)?; return ast::decl { exported = false, start = start, end = expr.end, decl = expr.expr as ast::assert_expr, docs = comment, }; }; let exported = match (try(lexer, ltok::EXPORT)?) { case void => yield false; case common::token => comment = strings::dup(lex::comment(lexer))!; yield true; }; const toks = [ltok::CONST, ltok::LET, ltok::DEF, ltok::TYPE]; const next = try(lexer, toks...)?; if (comment == "") { comment = strings::dup(lex::comment(lexer))!; }; let decl = match (next) { case void => yield decl_func(lexer)?; case let t: common::token => yield switch (t.0) { case ltok::TYPE => yield decl_type(lexer)?; case ltok::LET, ltok::CONST => yield decl_global(lexer, t.0)?; case ltok::DEF => yield decl_const(lexer, t.0)?; case => abort(); }; }; want(lexer, ltok::SEMICOLON)?; return ast::decl { exported = exported, start = start, end = lex::mkloc(lexer), decl = decl, docs = comment, }; }; // Parses the declarations for a sub-unit. export fn decls(lexer: *lex::lexer) ([]ast::decl | error) = { let decls: []ast::decl = []; for (true) { if (peek(lexer, ltok::EOF)? 
is common::token) break;
		append(decls, decl(lexer)?)!;
	};
	return decls;
};

hare-update-0.25.2.0/v0_25_2/parse/error.ha

// SPDX-License-Identifier: MPL-2.0
// (c) Hare authors

use common;
use fmt;
use v0_25_2::lex;

// All possible error types.
export type error = !common::error;

// Convert an error into a human-friendly string. The result may be statically
// allocated.
export fn strerror(err: error) const str = common::strerror(err: common::error);

fn syntaxerr(
	loc: common::location,
	fmt: str,
	args: fmt::field...
) common::error = {
	static let buf: [4096]u8 = [0...];
	let why = fmt::bsprintf(buf, fmt, args...)!;
	return lex::syntaxerr(loc, why);
};

hare-update-0.25.2.0/v0_25_2/parse/expr.ha

// SPDX-License-Identifier: MPL-2.0
// (c) Hare authors

use common;
use common::{token, ltok, nonterminal};
use math;
use strings;
use types;
use v0_25_2::ast;
use v0_25_2::lex;

// Parses an expression.
export fn expr(lexer: *lex::lexer) (ast::expr | error) = {
	const loc = lex::mkloc(lexer);

	// All assignment-op tokens
	const atoks: []ltok = [
		ltok::EQUAL, ltok::BANDEQ, ltok::BOREQ, ltok::BXOREQ,
		ltok::DIVEQ, ltok::LANDEQ, ltok::LOREQ, ltok::LXOREQ,
		ltok::LSHIFTEQ, ltok::MINUSEQ, ltok::MODEQ, ltok::PLUSEQ,
		ltok::RSHIFTEQ, ltok::TIMESEQ,
	];

	const ex = match (peek(lexer, ltok::IF, ltok::FOR, ltok::BREAK,
			ltok::CONTINUE, ltok::RETURN, ltok::YIELD)?) {
	case void =>
		yield binarithm(lexer, void, 0)?;
	case let tok: common::token =>
		yield switch (tok.0) {
		case ltok::IF =>
			yield if_expr(lexer)?;
		case ltok::FOR =>
			yield for_expr(lexer)?;
		case ltok::BREAK, ltok::CONTINUE, ltok::RETURN =>
			yield control(lexer)?;
		case ltok::YIELD =>
			yield yield_expr(lexer)?;
		case => abort(); // Invariant
		};
	};

	const tok = match (try(lexer, atoks...)?)
{ case let tok: common::token => yield tok; case => return ex; }; const is_obj_selector = match (ex.expr) { case (ast::access_expr | ast::slice_expr) => yield true; case let ex: ast::unarithm_expr => yield ex.op == ast::unarithm_op::DEREF; case => yield false; }; synassert(lex::mkloc(lexer), is_obj_selector, "Expected an object-selector, pointer dereference, or slice for assignment target")?; const ex = ast::assign_expr { op = switch (tok.0) { case ltok::EQUAL => yield void; case ltok::BANDEQ => yield ast::binarithm_op::BAND; case ltok::BOREQ => yield ast::binarithm_op::BOR; case ltok::BXOREQ => yield ast::binarithm_op::BXOR; case ltok::DIVEQ => yield ast::binarithm_op::DIV; case ltok::LANDEQ => yield ast::binarithm_op::LAND; case ltok::LOREQ => yield ast::binarithm_op::LOR; case ltok::LSHIFTEQ => yield ast::binarithm_op::LSHIFT; case ltok::LXOREQ => yield ast::binarithm_op::LXOR; case ltok::MINUSEQ => yield ast::binarithm_op::MINUS; case ltok::MODEQ => yield ast::binarithm_op::MODULO; case ltok::PLUSEQ => yield ast::binarithm_op::PLUS; case ltok::RSHIFTEQ => yield ast::binarithm_op::RSHIFT; case ltok::TIMESEQ => yield ast::binarithm_op::TIMES; case => abort(); // unreachable }, object = alloc(ex)!, value = alloc(expr(lexer)?)!, }; return ast::expr { start = loc, end = lex::prevloc(lexer), expr = ex, }; }; fn assert_expr(lexer: *lex::lexer, is_static: bool) (ast::expr | error) = { const tok = want(lexer, ltok::ABORT, ltok::ASSERT)?; let expr = switch (tok.0) { case ltok::ABORT => want(lexer, ltok::LPAREN)?; const msg: nullable *ast::expr = match (peek(lexer, ltok::RPAREN)?) { case common::token => yield null; case => yield alloc(expr(lexer)?)!; }; want(lexer, ltok::RPAREN)?; yield ast::assert_expr { cond = null, message = msg, is_static = is_static, }; case ltok::ASSERT => want(lexer, ltok::LPAREN)?; const cond: nullable *ast::expr = alloc(expr(lexer)?)!; const msg: nullable *ast::expr = match (try(lexer, ltok::COMMA)?) 
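// assert(cond) and assert(cond, msg) both reach this point; abort() above
// takes only an optional message. Both forms share this code path, with
// is_static distinguishing static assertions from the runtime form.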
{ case common::token => yield alloc(expr(lexer)?)!; case => yield null; }; want(lexer, ltok::RPAREN)?; yield ast::assert_expr { cond = cond, message = msg, is_static = is_static, }; case => abort(); // unreachable }; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = expr, }; }; fn alloc_expr(lexer: *lex::lexer) (ast::expr | error) = { on(lexer, nonterminal::ALLOC_EXPRESSION, null)?; const start = want(lexer, ltok::ALLOC)?; want(lexer, ltok::LPAREN)?; const init = alloc(expr(lexer)?)!; const expr = switch (want(lexer, ltok::COMMA, ltok::ELLIPSIS, ltok::RPAREN)?.0) { case ltok::COMMA => const capacity = alloc(expr(lexer)?)!; want(lexer, ltok::RPAREN)?; yield ast::alloc_expr { init = init, form = ast::alloc_form::COPY, capacity = capacity, }; case ltok::ELLIPSIS => want(lexer, ltok::RPAREN)?; yield ast::alloc_expr { init = init, form = ast::alloc_form::COPY, capacity = null, }; case ltok::RPAREN => yield ast::alloc_expr { init = init, form = ast::alloc_form::OBJECT, capacity = null, }; case => abort(); // unreachable }; return ast::expr { start = start.2, end = lex::prevloc(lexer), expr = expr, }; }; fn append_insert_expr( lexer: *lex::lexer, is_static: bool, ) (ast::expr | error) = { const tok = peek(lexer, ltok::APPEND, ltok::INSERT)? as token; switch (tok.0) { case ltok::APPEND => on(lexer, nonterminal::APPEND_EXPRESSION, null)?; case ltok::INSERT => on(lexer, nonterminal::INSERT_EXPRESSION, null)?; case => abort(); }; const tok = want(lexer, ltok::APPEND, ltok::INSERT)?; want(lexer, ltok::LPAREN)?; const object = if (tok.0 == ltok::APPEND) objsel(lexer)? else idxexpr(lexer)?; want(lexer, ltok::COMMA)?; const value = expr(lexer)?; let length: nullable *ast::expr = null; let variadic = false; match (try(lexer, ltok::COMMA, ltok::ELLIPSIS)?) 
{ case let tok: common::token => switch (tok.0) { case ltok::COMMA => length = alloc(expr(lexer)?)!; case ltok::ELLIPSIS => variadic = true; case => abort(); }; case void => void; }; want(lexer, ltok::RPAREN)?; let expr = ast::append_expr { object = alloc(object)!, value = alloc(value)!, length = length, variadic = variadic, is_static = is_static, }; const expr = if (tok.0 == ltok::INSERT) { yield expr: ast::insert_expr; } else expr; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = expr, }; }; fn measurement(lexer: *lex::lexer) (ast::expr | error) = { const tok = want(lexer, ltok::LEN, ltok::ALIGN, ltok::SIZE, ltok::OFFSET)?; want(lexer, ltok::LPAREN)?; const expr = switch (tok.0) { case ltok::LEN => yield alloc(expr(lexer)?)!: ast::len_expr; case ltok::ALIGN => yield alloc(_type(lexer)?)!: ast::align_expr; case ltok::SIZE => yield alloc(_type(lexer)?)!: ast::size_expr; case ltok::OFFSET => yield alloc(expr(lexer)?)!: ast::offset_expr; case => abort(); // unreachable }; want(lexer, ltok::RPAREN)?; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = expr, }; }; fn binarithm( lexer: *lex::lexer, lvalue: (ast::expr | void), i: int, ) (ast::expr | error) = { // Precedence climbing parser // https://en.wikipedia.org/wiki/Operator-precedence_parser let lvalue = match (lvalue) { case void => yield cast(lexer, void)?; case let expr: ast::expr => yield expr; }; let tok = lex::lex(lexer)?; for (let j = precedence(tok); j >= i; j = precedence(tok)) { const op = binop_for_tok(tok); let rvalue = cast(lexer, void)?; tok = lex::lex(lexer)?; for (let k = precedence(tok); k > j; k = precedence(tok)) { lex::unlex(lexer, tok); rvalue = binarithm(lexer, rvalue, k)?; tok = lex::lex(lexer)?; }; const expr = ast::expr { start = lvalue.start, end = lex::prevloc(lexer), expr = ast::binarithm_expr { op = op, lvalue = alloc(lvalue)!, rvalue = alloc(rvalue)!, }, }; lvalue = expr; }; lex::unlex(lexer, tok); return lvalue; }; fn binding_unpack(lexer: 
*lex::lexer) (ast::binding_unpack | error) = { let fields: ast::binding_unpack = []; for (true) { const (tok, value, _) = want(lexer, ltok::NAME, ltok::UNDERSCORE)?; if (tok == ltok::UNDERSCORE) { append(fields, void)!; } else { append(fields, value as str)!; }; if (len(fields) == 1) { want(lexer, ltok::COMMA)?; } else { match (try(lexer, ltok::COMMA)?) { case void => break; case common::token => void; }; }; }; want(lexer, ltok::RPAREN)?; return fields; }; fn binding(lexer: *lex::lexer, is_static: bool) (ast::expr | error) = { const loc = lex::mkloc(lexer); const tok = want(lexer, ltok::DEF, ltok::CONST, ltok::LET)?.0; const kind = switch (tok) { case ltok::DEF => assert(!is_static); yield ast::binding_kind::DEF; case ltok::CONST => yield ast::binding_kind::CONST; case ltok::LET => yield ast::binding_kind::LET; case => abort(); // unreachable }; let bindings: []ast::binding = []; for (true) { const (tok, value, _) = want(lexer, ltok::NAME, ltok::LPAREN)?; const name = switch (tok) { case ltok::NAME => yield value as str; case ltok::LPAREN => if (kind == ast::binding_kind::DEF) { return syntaxerr(lex::mkloc(lexer), "Can't use tuple unpacking with def"); }; yield binding_unpack(lexer)?; case => abort(); }; const btype: nullable *ast::_type = if (try(lexer, ltok::COLON)? is common::token) { yield alloc(_type(lexer)?)!; } else null; want(lexer, ltok::EQUAL)?; const init = alloc(expr(lexer)?)!; append(bindings, ast::binding { name = name, _type = btype, init = init, })!; match (try(lexer, ltok::COMMA)?) 
{ case void => break; case common::token => void; }; }; return ast::expr { start = loc, end = lex::prevloc(lexer), expr = ast::binding_expr { is_static = is_static, kind = kind, bindings = bindings, }, }; }; fn builtin(lexer: *lex::lexer) (ast::expr | error) = { const tok = match (peek(lexer, ltok::ALIGN, ltok::ALLOC, ltok::APPEND, ltok::FREE, ltok::DELETE, ltok::ABORT, ltok::ASSERT, ltok::INSERT, ltok::STATIC, ltok::SIZE, ltok::LEN, ltok::OFFSET, ltok::VASTART, ltok::VAARG, ltok::VAEND)?) { case let tok: common::token => yield tok; case void => return plain_expression(lexer); }; switch (tok.0) { case ltok::ALLOC => return alloc_expr(lexer); case ltok::APPEND, ltok::INSERT => return append_insert_expr(lexer, false); case ltok::DELETE => return delete_expr(lexer, false); case ltok::FREE => return free_expr(lexer); case ltok::ABORT, ltok::ASSERT => return assert_expr(lexer, false); case ltok::STATIC => want(lexer, ltok::STATIC)!; return static_expr(lexer); case ltok::ALIGN, ltok::SIZE, ltok::LEN, ltok::OFFSET => return measurement(lexer); case ltok::VASTART => want(lexer, ltok::VASTART)?; want(lexer, ltok::LPAREN)?; want(lexer, ltok::RPAREN)?; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = void: ast::vastart_expr: ast::variadic_expr, }; case ltok::VAARG => want(lexer, ltok::VAARG)?; want(lexer, ltok::LPAREN)?; const ap = alloc(objsel(lexer)?)!; want(lexer, ltok::COMMA)?; const _type = alloc(_type(lexer)?)!; want(lexer, ltok::RPAREN)?; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = ast::vaarg_expr { ap = ap, _type = _type, }, }; case ltok::VAEND => want(lexer, ltok::VAEND)?; want(lexer, ltok::LPAREN)?; const expr = alloc(objsel(lexer)?)!; want(lexer, ltok::RPAREN)?; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = expr: ast::vaend_expr: ast::variadic_expr, }; case => abort(); // Invariant }; }; export fn call(lexer: *lex::lexer, lvalue: ast::expr) (ast::expr | error) = { on(lexer, 
nonterminal::CALL_EXPRESSION, &lvalue)?; want(lexer, ltok::LPAREN)?; let args: []*ast::expr = []; let variadic = false; for (true) { match (try(lexer, ltok::RPAREN)?) { case common::token => break; case void => void; }; append(args, alloc(expr(lexer)?)!)!; match (try(lexer, ltok::ELLIPSIS)?) { case common::token => variadic = true; want(lexer, ltok::RPAREN)?; break; case void => void; }; switch (want(lexer, ltok::COMMA, ltok::RPAREN)?.0) { case ltok::RPAREN => break; case => void; }; }; return ast::expr { start = lvalue.start, end = lex::mkloc(lexer), expr = ast::call_expr { lvalue = alloc(lvalue)!, variadic = variadic, args = args, }, }; }; fn cast(lexer: *lex::lexer, lvalue: (ast::expr | void)) (ast::expr | error) = { const lvalue = match (lvalue) { case void => yield unarithm(lexer)?; case let e: ast::expr => yield e; }; const tok = match (try(lexer, ltok::COLON, ltok::AS, ltok::IS)?) { case void => return lvalue; case let tok: common::token => yield tok.0; }; const kind = switch (tok) { case ltok::COLON => yield ast::cast_kind::CAST; case ltok::AS => yield ast::cast_kind::ASSERTION; case ltok::IS => yield ast::cast_kind::TEST; case => abort(); }; let typ = match (try(lexer, ltok::NULL)?) 
{ case let t: common::token => yield alloc(ast::_type { start = t.2, end = lex::prevloc(lexer), flags = 0, repr = ast::builtin_type::NULL, })!; case void => yield alloc(_type(lexer)?)!; }; return cast(lexer, ast::expr { start = lvalue.start, end = lex::prevloc(lexer), expr = ast::cast_expr { kind = kind, value = alloc(lvalue)!, _type = typ, }, })?; }; fn literal(lexer: *lex::lexer) (ast::expr | error) = { const tok = want(lexer)?; const expr: ast::literal_expr = switch (tok.0) { case ltok::LIT_RCONST, ltok::LIT_STR => yield tok.1 as (rune | str); case ltok::LIT_U8, ltok::LIT_U16, ltok::LIT_U32, ltok::LIT_U64, ltok::LIT_UINT, ltok::LIT_SIZE => yield ast::number_literal { suff = tok.0, value = tok.1 as u64, sign = false, }; case ltok::LIT_I8, ltok::LIT_I16, ltok::LIT_I32, ltok::LIT_I64, ltok::LIT_INT => const n = tok.1 as u64; yield ast::number_literal { suff = tok.0, value = n: i64, sign = false, }; case ltok::LIT_ICONST => const n = tok.1 as u64; yield ast::number_literal { suff = tok.0, value = if (n <= types::I64_MAX: u64) n: i64 else n, sign = false, }; case ltok::LIT_F32, ltok::LIT_F64, ltok::LIT_FCONST => yield ast::number_literal { suff = tok.0, value = tok.1 as f64, sign = false, }; case ltok::VOID => yield void; case ltok::NOMEM => yield nomem; case ltok::DONE => yield done; case ltok::TRUE => yield true; case ltok::FALSE => yield false; case ltok::NULL => yield ast::_null; case => return syntaxerr(lex::mkloc(lexer), "Expected literal expression"); }; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = expr, }; }; fn control(lexer: *lex::lexer) (ast::expr | error) = { let tok = want(lexer, ltok::BREAK, ltok::CONTINUE, ltok::RETURN)?; let label = if (tok.0 == ltok::BREAK || tok.0 == ltok::CONTINUE) { yield match (try(lexer, ltok::COLON)?) 
{ case common::token => yield want(lexer, ltok::NAME)?.1 as str; case void => yield ""; }; } else ""; const expr = switch (tok.0) { case ltok::BREAK => yield label: ast::break_expr; case ltok::CONTINUE => yield label: ast::continue_expr; case ltok::RETURN => yield match (peek(lexer, ltok::COMMA, ltok::ELSE, ltok::RBRACE, ltok::RBRACKET, ltok::RPAREN, ltok::SEMICOLON, ltok::EOF)?) { case void => yield alloc(expr(lexer)?)!: ast::return_expr; case common::token => yield null: ast::return_expr; }; case => abort(); // unreachable }; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = expr, }; }; fn delete_expr(lexer: *lex::lexer, is_static: bool) (ast::expr | error) = { const start = want(lexer, ltok::DELETE)?; want(lexer, ltok::LPAREN)?; const expr = alloc(postfix(lexer, void)?)!; // TODO: Assert that this was an indexing expression want(lexer, ltok::RPAREN)?; return ast::expr { start = start.2, end = lex::prevloc(lexer), expr = ast::delete_expr { object = expr, is_static = is_static, }, }; }; fn compound_expr(lexer: *lex::lexer) (ast::expr | error) = { let items: []*ast::expr = []; const start = want(lexer, ltok::LBRACE, ltok::COLON)?; const label = switch (start.0) { case ltok::COLON => const tok = want(lexer, ltok::NAME)?; want(lexer, ltok::LBRACE)?; yield tok.1 as str; case => yield ""; }; for (true) { append(items, alloc(stmt(lexer)?)!)!; if (try(lexer, ltok::RBRACE)? is common::token) { break; }; }; return ast::expr { start = start.2, end = lex::prevloc(lexer), expr = ast::compound_expr { exprs = items, label = label, }, }; }; fn stmt(lexer: *lex::lexer) (ast::expr | error) = { const expr = match (try(lexer, ltok::DEFER, ltok::DEF, ltok::LET, ltok::CONST, ltok::STATIC)?) 
{ case let tok: common::token => yield switch (tok.0) { case ltok::DEFER => let expr = alloc(expr(lexer)?)!; yield ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = expr: ast::defer_expr, }; case ltok::DEF, ltok::CONST, ltok::LET => lex::unlex(lexer, tok); yield binding(lexer, false)?; case ltok::STATIC => yield match (peek(lexer, ltok::LET, ltok::CONST)?) { case common::token => yield binding(lexer, true)?; case void => yield static_expr(lexer)?; }; case => abort(); // unreachable }; case void => yield expr(lexer)?; }; want(lexer, ltok::SEMICOLON)?; return expr; }; fn for_expr(lexer: *lex::lexer) (ast::expr | error) = { const tok = want(lexer, ltok::FOR)?; const label = if (try(lexer, ltok::COLON)? is common::token) { const tok = want(lexer, ltok::NAME)?; yield tok.1 as str; } else ""; want(lexer, ltok::LPAREN)?; let kind = void: (ast::for_kind | void); let predicate_loc = lex::mkloc(lexer); const bindings = match (try(lexer, ltok::LET, ltok::CONST)?) { case let tok: common::token => const binding_kind = switch (tok.0) { case ltok::LET => yield ast::binding_kind::LET; case ltok::CONST => yield ast::binding_kind::CONST; case => abort(); // unreachable }; let bindings: []ast::binding = []; for (true) { const (tok, value, _) = want(lexer, ltok::NAME, ltok::LPAREN)?; const binding_name = switch (tok) { case ltok::NAME => yield value as str; case ltok::LPAREN => yield binding_unpack(lexer)?; case => abort(); // unreachable }; const btype: nullable *ast::_type = if (try(lexer, ltok::COLON)? 
is common::token) { yield alloc(_type(lexer)?)!; } else null; const (tok, _, _) = want(lexer, ltok::EQUAL, ltok::DOUBLE_DOT, ltok::BAND, ltok::ARROW)?; if (kind is void) { switch (tok) { case ltok::EQUAL => kind = ast::for_kind::ACCUMULATOR; case ltok::DOUBLE_DOT => kind = ast::for_kind::EACH_VALUE; case ltok::BAND => want(lexer, ltok::DOUBLE_DOT)?; kind = ast::for_kind::EACH_POINTER; case ltok::ARROW => kind = ast::for_kind::ITERATOR; case => abort(); // unreachable }; } else if (kind as ast::for_kind != ast::for_kind::ACCUMULATOR || tok != ltok::EQUAL) { return syntaxerr(lex::mkloc(lexer), "Cannot create multiple bindings in non-c-style loop"); }; const init_expr = alloc(expr(lexer)?)!; append(bindings, ast::binding { name = binding_name, _type = btype, init = init_expr, })!; match (try(lexer, ltok::COMMA)?) { case common::token => void; case void => break; }; }; if (kind as ast::for_kind == ast::for_kind::ACCUMULATOR) { want(lexer, ltok::SEMICOLON)?; }; yield alloc(ast::expr { start = predicate_loc, end = lex::prevloc(lexer), expr = ast::binding_expr { is_static = false, kind = binding_kind, bindings = bindings, }, })!; case void => kind = ast::for_kind::ACCUMULATOR; yield null; }; const cond: nullable *ast::expr = null; const afterthought: nullable *ast::expr = null; if (kind as ast::for_kind == ast::for_kind::ACCUMULATOR) { cond = alloc(expr(lexer)?)!; match (try(lexer, ltok::SEMICOLON)) { case common::token => afterthought = alloc(expr(lexer)?)!; case void => void; }; }; want(lexer, ltok::RPAREN)?; const body = alloc(expr(lexer)?)!; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = ast::for_expr { kind = kind as ast::for_kind, bindings = bindings, cond = cond, afterthought = afterthought, body = body, label = label, }, }; }; fn free_expr(lexer: *lex::lexer) (ast::expr | error) = { const start = want(lexer, ltok::FREE)?; want(lexer, ltok::LPAREN)?; const expr = alloc(expr(lexer)?)!; want(lexer, ltok::RPAREN)?; return ast::expr { start = 
start.2, end = lex::prevloc(lexer), expr = expr: ast::free_expr, }; }; fn if_expr(lexer: *lex::lexer) (ast::expr | error) = { const start = want(lexer, ltok::IF)?; want(lexer, ltok::LPAREN)?; const cond = alloc(expr(lexer)?)!; want(lexer, ltok::RPAREN)?; const tbranch = alloc(expr(lexer)?)!; const fbranch: nullable *ast::expr = match (try(lexer, ltok::ELSE)?) { case void => yield null; case common::token => yield alloc(expr(lexer)?)!; }; return ast::expr { start = start.2, end = lex::prevloc(lexer), expr = ast::if_expr { cond = cond, tbranch = tbranch, fbranch = fbranch, }, }; }; fn indexing(lexer: *lex::lexer, lvalue: ast::expr) (ast::expr | error) = { let is_slice = false; let start: nullable *ast::expr = null, end: nullable *ast::expr = null; if (try(lexer, ltok::DOUBLE_DOT)? is common::token) { is_slice = true; } else { start = alloc(expr(lexer)?)!; }; if (!is_slice && try(lexer, ltok::DOUBLE_DOT)? is common::token) { is_slice = true; }; if (is_slice && peek(lexer, ltok::RBRACKET)? is void) { end = alloc(expr(lexer)?)!; }; want(lexer, ltok::RBRACKET)?; return ast::expr { start = lvalue.start, end = lex::prevloc(lexer), expr = if (is_slice) ast::slice_expr { object = alloc(lvalue)!, start = start, end = end, } else ast::access_index { object = alloc(lvalue)!, index = { assert(end == null); yield start as *ast::expr; }, }, }; }; fn objsel(lexer: *lex::lexer) (ast::expr | error) = { let expr = postfix(lexer, void)?; synassert(lex::mkloc(lexer), expr.expr is ast::access_expr, "Expected object selector")?; return expr; }; fn idxexpr(lexer: *lex::lexer) (ast::expr | error) = { const expr = postfix(lexer, void)?; synassert(lex::mkloc(lexer), expr.expr is ast::access_expr && expr.expr as ast::access_expr is ast::access_index, "Expected indexing expression")?; return expr; }; fn plain_expression(lexer: *lex::lexer) (ast::expr | error) = { let tok = peek(lexer)? 
as common::token; if (tok.0 >= ltok::LIT_U8 && tok.0 <= ltok::LAST_LITERAL) { return literal(lexer); }; switch (tok.0) { case ltok::TRUE, ltok::FALSE, ltok::NULL, ltok::VOID, ltok::DONE, ltok::NOMEM => return literal(lexer); case ltok::LBRACKET => return plain_array(lexer)?; case ltok::STRUCT => let s = plain_struct(lexer, [])?; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = s, }; case ltok::LPAREN => want(lexer, ltok::LPAREN)?; let ex = expr(lexer)?; switch (want(lexer, ltok::RPAREN, ltok::COMMA)?.0) { case ltok::RPAREN => return ex; case ltok::COMMA => return plain_tuple(lexer, ex, tok.2)?; case => abort(); }; case ltok::NAME => let id = ident(lexer)?; match (peek(lexer, ltok::LBRACE)?) { case void => return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = id: ast::access_identifier, }; case common::token => let s = plain_struct(lexer, id)?; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = s, }; }; case => return syntaxerr(lex::mkloc(lexer), "Unexpected {}, was expecting an expression", common::tokstr(tok)); }; }; fn plain_array(lexer: *lex::lexer) (ast::expr | error) = { const start = want(lexer, ltok::LBRACKET)?; let values: []*ast::expr = []; let expand = false; for (true) { match (try(lexer, ltok::RBRACKET)?) { case common::token => break; case void => void; }; append(values, alloc(expr(lexer)?)!)!; match (try(lexer, ltok::COMMA, ltok::ELLIPSIS)?) 
{ case void => want(lexer, ltok::RBRACKET)?; break; case let tok: common::token => switch (tok.0) { case ltok::ELLIPSIS => expand = true; want(lexer, ltok::RBRACKET)?; break; case ltok::COMMA => void; case => abort(); }; }; }; return ast::expr { start = start.2, end = lex::prevloc(lexer), expr = ast::array_literal { expand = expand, values = values, }, }; }; fn plain_struct( lexer: *lex::lexer, alias: ast::ident, ) (ast::struct_literal | error) = { if (len(alias) == 0) { want(lexer, ltok::STRUCT)?; }; want(lexer, ltok::LBRACE)?; let autofill = false; let fields: [](ast::struct_value | *ast::struct_literal) = []; for (true) { const tok = want(lexer, ltok::ELLIPSIS, ltok::NAME, ltok::STRUCT)?; switch (tok.0) { case ltok::ELLIPSIS => synassert(lex::mkloc(lexer), len(alias) != 0, "Cannot use auto-fill with anonymous struct")?; autofill = true; want(lexer, ltok::RBRACE)?; break; case ltok::NAME, ltok::STRUCT => lex::unlex(lexer, tok); append(fields, struct_field(lexer)?)!; case => abort(); // unreachable }; switch (want(lexer, ltok::COMMA, ltok::RBRACE)?.0) { case ltok::RBRACE => break; case ltok::COMMA => if (try(lexer, ltok::RBRACE)? is common::token) { break; }; case => abort(); // unreachable }; }; return ast::struct_literal { autofill = autofill, alias = alias, fields = fields, }; }; fn struct_field( lexer: *lex::lexer, ) (ast::struct_value | *ast::struct_literal | error) = { const tok = want(lexer, ltok::NAME, ltok::STRUCT)?; switch (tok.0) { case ltok::NAME => const name = strings::dup(tok.1 as str)!; const tok = match (try(lexer, ltok::COLON, ltok::DOUBLE_COLON, ltok::EQUAL)?) 
{ case let tok: common::token => yield tok; case void => let id: ast::ident = alloc([name])!; return alloc(plain_struct(lexer, id)?)!; }; switch (tok.0) { case ltok::COLON => const _type = alloc(_type(lexer)?)!; want(lexer, ltok::EQUAL)?; const init = alloc(expr(lexer)?)!; return ast::struct_value { name = name, _type = _type, init = init, }; case ltok::DOUBLE_COLON => let id: ast::ident = alloc([name])!; let rest = ident(lexer)?; append(id, rest...)!; return alloc(plain_struct(lexer, id)?)!; case ltok::EQUAL => return ast::struct_value { name = name, _type = null, init = alloc(expr(lexer)?)!, }; case => abort(); // Invariant }; case ltok::STRUCT => lex::unlex(lexer, tok); return alloc(plain_struct(lexer, [])?)!; case => abort(); // Invariant }; }; fn plain_tuple( lexer: *lex::lexer, ex: ast::expr, start: common::location ) (ast::expr | error) = { let values: []*ast::expr = []; append(values, alloc(ex)!)!; for (true) { append(values, alloc(expr(lexer)?)!)!; match (try(lexer, ltok::COMMA)?) { case common::token => match (try(lexer, ltok::RPAREN)) { case common::token => break; case => void; }; case void => want(lexer, ltok::RPAREN)?; break; }; }; return ast::expr { start = start, end = lex::prevloc(lexer), expr = values: ast::tuple_literal, }; }; fn postfix(lexer: *lex::lexer, lvalue: (ast::expr | void)) (ast::expr | error) = { let lvalue = match (lvalue) { case void => yield builtin(lexer)?; case let ex: ast::expr => yield ex; }; let tok = match (try(lexer, ltok::LPAREN, ltok::DOT, ltok::LBRACKET, ltok::QUESTION, ltok::LNOT)?) 
{ case void => return lvalue; case let tok: common::token => yield tok; }; let next = switch (tok.0) { case ltok::LPAREN => lex::unlex(lexer, tok); yield call(lexer, lvalue)?; case ltok::DOT => yield postfix_dot(lexer, lvalue)?; case ltok::LBRACKET => yield indexing(lexer, lvalue)?; case ltok::QUESTION => yield ast::expr { start = lvalue.start, end = lex::prevloc(lexer), expr = alloc(lvalue)!: ast::propagate_expr, }; case ltok::LNOT => yield ast::expr { start = lvalue.start, end = lex::prevloc(lexer), expr = alloc(lvalue)!: ast::error_assert_expr, }; case => abort(); }; return postfix(lexer, next); }; fn postfix_dot( lexer: *lex::lexer, lvalue: ast::expr, ) (ast::expr | error) = { match (try(lexer, ltok::NAME)?) { case let tok: common::token => return ast::expr { start = lvalue.start, end = lex::prevloc(lexer), expr = ast::access_field { object = alloc(lvalue)!, field = tok.1 as str, }, }; case void => let lit = literal(lexer)?; let val = lit.expr as ast::literal_expr; synassert(lex::mkloc(lexer), val is ast::number_literal, "Expected integer literal")?; let val = val as ast::number_literal; return ast::expr { start = lvalue.start, end = lex::prevloc(lexer), expr = ast::access_tuple { object = alloc(lvalue)!, value = alloc(lit)!, }, }; }; }; fn static_expr(lexer: *lex::lexer) (ast::expr | error) = { const tok = want(lexer, ltok::ABORT, ltok::ASSERT, ltok::APPEND, ltok::INSERT, ltok::DELETE)?; lex::unlex(lexer, tok); switch (tok.0) { case ltok::ABORT, ltok::ASSERT => return assert_expr(lexer, true); case ltok::APPEND, ltok::INSERT => let expr = append_insert_expr(lexer, true)?; return postfix(lexer, expr); case ltok::DELETE => return delete_expr(lexer, true); case => abort(); // unreachable }; }; fn switch_expr(lexer: *lex::lexer) (ast::expr | error) = { const start = want(lexer, ltok::SWITCH)?; const label = if (try(lexer, ltok::COLON)? 
is common::token) { const tok = want(lexer, ltok::NAME)?; yield tok.1 as str; } else ""; want(lexer, ltok::LPAREN)?; const value = expr(lexer)?; want(lexer, ltok::RPAREN)?; want(lexer, ltok::LBRACE)?; let cases: []ast::switch_case = []; for (true) { want(lexer, ltok::CASE)?; let opts: []*ast::expr = []; if (try(lexer, ltok::ARROW)? is void) for (true) { append(opts, alloc(expr(lexer)?)!)!; switch (want(lexer, ltok::ARROW, ltok::COMMA)?.0) { case ltok::ARROW => break; case ltok::COMMA => if (try(lexer, ltok::ARROW)? is common::token) { break; }; case => abort(); // unreachable }; }; let exprs: []*ast::expr = []; for (true) { append(exprs, alloc(stmt(lexer)?)!)!; match (peek(lexer, ltok::CASE, ltok::RBRACE)?) { case common::token => break; case void => void; }; }; append(cases, ast::switch_case { options = opts, exprs = exprs, })!; if (try(lexer, ltok::RBRACE)? is common::token) { break; }; }; return ast::expr { start = start.2, end = lex::prevloc(lexer), expr = ast::switch_expr { value = alloc(value)!, cases = cases, label = label, }, }; }; fn match_case(lexer: *lex::lexer) (ast::match_case | error) = { want(lexer, ltok::CASE)?; let tok = lex::lex(lexer)?; let loc = tok.2; let name: str = "", typ: nullable *ast::_type = null; switch (tok.0) { case ltok::NULL => typ = alloc(ast::_type { start = loc, end = lex::prevloc(lexer), flags = 0, repr = ast::builtin_type::NULL, })!; case ltok::LET => name = want(lexer, ltok::NAME)?.1 as str; want(lexer, ltok::COLON)?; typ = alloc(_type(lexer)?)!; case ltok::ARROW => lex::unlex(lexer, tok); case => lex::unlex(lexer, tok); typ = alloc(_type(lexer)?)!; }; want(lexer, ltok::ARROW)?; let exprs: []*ast::expr = []; for (true) { append(exprs, alloc(stmt(lexer)?)!)!; if (peek(lexer, ltok::CASE, ltok::RBRACE)? 
is common::token) { break; }; }; return ast::match_case { name = name, _type = typ, exprs = exprs, }; }; fn match_expr(lexer: *lex::lexer) (ast::expr | error) = { const start = want(lexer, ltok::MATCH)?; const label = if (try(lexer, ltok::COLON)? is common::token) { const tok = want(lexer, ltok::NAME)?; yield tok.1 as str; } else ""; want(lexer, ltok::LPAREN)?; const value = expr(lexer)?; want(lexer, ltok::RPAREN)?; want(lexer, ltok::LBRACE)?; let cases: []ast::match_case = []; for (true) { append(cases, match_case(lexer)?)!; if (try(lexer, ltok::RBRACE)? is common::token) { break; }; }; return ast::expr { start = start.2, end = lex::prevloc(lexer), expr = ast::match_expr { value = alloc(value)!, cases = cases, label = label, }, }; }; fn unarithm(lexer: *lex::lexer) (ast::expr | error) = { const tok = match (try(lexer, ltok::MINUS, ltok::BNOT, ltok::LNOT, ltok::TIMES, ltok::BAND, ltok::SWITCH, ltok::MATCH, ltok::COLON, ltok::LBRACE)?) { case void => return postfix(lexer, void); case let tok: common::token => yield switch (tok.0) { case ltok::SWITCH => lex::unlex(lexer, tok); return switch_expr(lexer); case ltok::MATCH => lex::unlex(lexer, tok); return match_expr(lexer); case ltok::COLON, ltok::LBRACE => lex::unlex(lexer, tok); return compound_expr(lexer); case => yield tok; }; }; const op = switch (tok.0) { case ltok::MINUS => yield ast::unarithm_op::MINUS; case ltok::BNOT => yield ast::unarithm_op::BNOT; case ltok::LNOT => yield ast::unarithm_op::LNOT; case ltok::TIMES => yield ast::unarithm_op::DEREF; case ltok::BAND => yield ast::unarithm_op::ADDR; case => abort(); }; const operand = unarithm(lexer)?; const expr = :blk { if (op == ast::unarithm_op::MINUS) match (operand.expr) { case let c: ast::literal_expr => match (c) { case let n: ast::number_literal => let sign = false; const val = match (n.value) { case let i: i64 => sign = i < 0; yield -i; case let u: u64 => void; case let f: f64 => sign = math::signf64(f) < 0; yield -f; }; if (val is void) yield; yield 
:blk, ast::number_literal { suff = n.suff, value = val as (i64 | f64), sign = sign, }: ast::literal_expr; case => void; }; case => void; }; yield ast::unarithm_expr { op = op, operand = alloc(operand)!, }; }; return ast::expr { start = tok.2, end = lex::prevloc(lexer), expr = expr, }; }; fn yield_expr(lexer: *lex::lexer) (ast::expr | error) = { const start = want(lexer, ltok::YIELD)?; let label = ""; let value: nullable *ast::expr = null; match (try(lexer, ltok::COLON, ltok::COMMA, ltok::ELSE, ltok::RBRACE, ltok::RBRACKET, ltok::RPAREN, ltok::SEMICOLON, ltok::EOF)?) { case void => value = alloc(expr(lexer)?)!; case let t: common::token => if (t.0 == ltok::COLON) { label = want(lexer, ltok::NAME)?.1 as str; match (try(lexer, ltok::COMMA)?) { case void => void; case common::token => value = alloc(expr(lexer)?)!; }; } else { lex::unlex(lexer, t); }; }; return ast::expr { start = start.2, end = lex::prevloc(lexer), expr = ast::yield_expr { label = label, value = value, }, }; }; fn binop_for_tok(tok: common::token) ast::binarithm_op = { switch (tok.0) { case ltok::BAND => return ast::binarithm_op::BAND; case ltok::BOR => return ast::binarithm_op::BOR; case ltok::BXOR => return ast::binarithm_op::BXOR; case ltok::DIV => return ast::binarithm_op::DIV; case ltok::GT => return ast::binarithm_op::GT; case ltok::GTEQ => return ast::binarithm_op::GTEQ; case ltok::LAND => return ast::binarithm_op::LAND; case ltok::LEQUAL => return ast::binarithm_op::LEQUAL; case ltok::LESS => return ast::binarithm_op::LESS; case ltok::LESSEQ => return ast::binarithm_op::LESSEQ; case ltok::LOR => return ast::binarithm_op::LOR; case ltok::LSHIFT => return ast::binarithm_op::LSHIFT; case ltok::LXOR => return ast::binarithm_op::LXOR; case ltok::MINUS => return ast::binarithm_op::MINUS; case ltok::MODULO => return ast::binarithm_op::MODULO; case ltok::NEQUAL => return ast::binarithm_op::NEQUAL; case ltok::PLUS => return ast::binarithm_op::PLUS; case ltok::RSHIFT => return ast::binarithm_op::RSHIFT; 
	case ltok::TIMES =>
		return ast::binarithm_op::TIMES;
	case => abort();
	};
};

fn precedence(tok: common::token) int = {
	switch (tok.0) {
	case ltok::LOR =>
		return 0;
	case ltok::LXOR =>
		return 1;
	case ltok::LAND =>
		return 2;
	case ltok::LEQUAL, ltok::NEQUAL =>
		return 3;
	case ltok::LESS, ltok::LESSEQ, ltok::GT, ltok::GTEQ =>
		return 4;
	case ltok::BOR =>
		return 5;
	case ltok::BXOR =>
		return 6;
	case ltok::BAND =>
		return 7;
	case ltok::LSHIFT, ltok::RSHIFT =>
		return 8;
	case ltok::PLUS, ltok::MINUS =>
		return 9;
	case ltok::TIMES, ltok::DIV, ltok::MODULO =>
		return 10;
	case =>
		return -1;
	};
};

hare-update-0.25.2.0/v0_25_2/parse/hooks.ha

// SPDX-License-Identifier: MPL-2.0
// (c) Hare authors

use common;
use common::{nonterminal};
use v0_25_2::lex;

export type hook = struct {
	func: *hookfunc,
	user: nullable *opaque,
};

let hooks: [nonterminal::LAST + 1][]hook = [[]...];

// A function which can be called when the parser encounters a specific
// nonterminal.
export type hookfunc = fn(
	lex: *lex::lexer,
	data: nullable *opaque,
	user: nullable *opaque,
) (void | error);

// Registers a parser hook. The hook will be called when a non-terminal is
// encountered. During hook execution, the parser state is saved, and is
// restored after the hook executes. The hook may make use of parse or lex
// functions to examine the token stream while it executes, and parsing will
// resume from prior to the hook's execution.
//
// Hook execution is disabled while hooks are being executed, so recursive hooks
// will not fire if a hook makes use of the parser.
export fn register_hook(
	target: nonterminal,
	func: *hookfunc,
	user: nullable *opaque,
) void = {
	append(hooks[target], hook {
		func = func,
		user = user,
	})!;
};

// Executes hooks associated with a nonterminal.
fn on(
	lex: *lex::lexer,
	nt: nonterminal,
	data: nullable *opaque = null,
) (void | error) = {
	static let executing = false;
	if (executing) {
		return;
	};
	executing = true;
	defer executing = false;

	for (let hook .. hooks[nt]) {
		let state = lex::save(lex)?;
		defer lex::restore(lex, &state)!;
		hook.func(lex, data, hook.user)?;
	};
};

@fini fn fini() void = {
	for (let hooks .. hooks) {
		free(hooks);
	};
};

hare-update-0.25.2.0/v0_25_2/parse/ident.ha

// SPDX-License-Identifier: MPL-2.0
// (c) Hare authors

use bufio;
use common;
use common::{ltok, nonterminal};
use memio;
use strings;
use v0_25_2::ast;
use v0_25_2::lex;

// Parses a single identifier, possibly with a trailing ::, i.e. 'foo::bar::'.
// Returns the identifier and whether there's a trailing ::.
export fn ident_trailing(lexer: *lex::lexer) ((ast::ident, bool) | error) = {
	let ident: []str = [];
	let trailing = false;
	const tok = want(lexer, ltok::NAME)?;
	append(ident, tok.1 as str)!;
	const loc = tok.2;
	let z = len(ident[0]);
	for (true) {
		match (try(lexer, ltok::DOUBLE_COLON)?) {
		case void =>
			break;
		case =>
			void; // Grab the next ident
		};
		z += 1;
		let name = match (try(lexer, ltok::NAME, ltok::NOMEM)?) {
		case let t: common::token =>
			// 0.24.2 should parse "errors::nomem" as an ident
			yield switch (t.0) {
			case ltok::NAME =>
				yield t.1 as str;
			case ltok::NOMEM =>
				yield strings::dup("nomem")!;
			case => abort();
			};
		case void =>
			trailing = true;
			break;
		};
		append(ident, name)!;
		z += len(name);
	};
	if (z > ast::IDENT_MAX) {
		ast::ident_free(ident: ast::ident);
		return syntaxerr(loc, "Identifier exceeds maximum length");
	};
	return (ident: ast::ident, trailing);
};

// Parses a single identifier, i.e. 'foo::bar::baz'.
export fn ident(lexer: *lex::lexer) (ast::ident | error) = {
	on(lexer, nonterminal::IDENTIFIER)?;
	let ident = ident_trailing(lexer)?;
	synassert(lex::mkloc(lexer), !ident.1,
		"Unexpected trailing :: in ident")?;
	return ident.0;
};

// A convenience function which parses an identifier from a string, so the
// caller needn't provide a lexer instance.
export fn identstr(in: str) (ast::ident | error) = {
	let in = memio::fixed(strings::toutf8(in));
	let sc = bufio::newscanner(&in);
	defer bufio::finish(&sc);
	let lexer = lex::init(&sc, "");
	let ret = ident(&lexer);
	want(&lexer, ltok::EOF)?;
	return ret;
};

hare-update-0.25.2.0/v0_25_2/parse/import.ha

// SPDX-License-Identifier: MPL-2.0
// (c) Hare authors

use common::{token, ltok, nonterminal};
use v0_25_2::ast;
use v0_25_2::lex;

fn name_list(lexer: *lex::lexer) (ast::import_members | error) = {
	let names: []str = [];
	for (true) {
		append(names, want(lexer, ltok::NAME)?.1 as str)!;
		switch (want(lexer, ltok::COMMA, ltok::RBRACE)?.0) {
		case ltok::COMMA =>
			match (try(lexer, ltok::RBRACE)?) {
			case void => void;
			case =>
				return names;
			};
		case ltok::RBRACE =>
			return names;
		case => abort(); // Unreachable
		};
	};
};

// Parses the import list for a sub-unit
export fn imports(lexer: *lex::lexer) ([]ast::import | error) = {
	on(lexer, nonterminal::IMPORTS, null)?;
	let imports: []ast::import = [];
	for (true) {
		const start = match (try(lexer, ltok::USE)?) {
		case void =>
			break;
		case let tok: token =>
			yield tok.2;
		};
		append(imports, ast::import {
			bindings = void,
			...
		})!;
		let import = &imports[len(imports) - 1];
		import.start = start;
		let (name, trailing) = ident_trailing(lexer)?;
		import.ident = name;
		switch (want(lexer, ltok::SEMICOLON, ltok::LBRACE,
				ltok::EQUAL, ltok::TIMES)?.0) {
		case ltok::SEMICOLON =>
			synassert(lex::mkloc(lexer), !trailing,
				"Unexpected trailing :: in ident")?;
		case ltok::LBRACE =>
			synassert(lex::mkloc(lexer), trailing,
				"Expected trailing :: in ident")?;
			import.bindings = name_list(lexer)?;
			want(lexer, ltok::SEMICOLON)?;
		case ltok::EQUAL =>
			synassert(lex::mkloc(lexer),
				len(name) == 1 && !trailing,
				"Expected name, not ident")?;
			import.bindings = name[0];
			free(name);
			import.ident = ident(lexer)?;
			want(lexer, ltok::SEMICOLON)?;
		case ltok::TIMES =>
			synassert(lex::mkloc(lexer), trailing,
				"Expected trailing :: in ident")?;
			import.bindings = ast::import_wildcard;
			want(lexer, ltok::SEMICOLON)?;
		case => abort(); // Unreachable
		};
		import.end = lex::mkloc(lexer);
	};
	return imports;
};

hare-update-0.25.2.0/v0_25_2/parse/type.ha

// SPDX-License-Identifier: MPL-2.0
// (c) Hare authors

use common;
use common::{ltok, nonterminal};
use strings;
use v0_25_2::ast;
use v0_25_2::ast::{builtin_type};
use v0_25_2::lex;

export fn prototype(lexer: *lex::lexer) (ast::func_type | error) = {
	on(lexer, nonterminal::PROTOTYPE, null)?;
	let variadism = ast::variadism::NONE;
	let params: []ast::func_param = [];
	want(lexer, ltok::LPAREN)?;
	for (try(lexer, ltok::RPAREN)? is void) {
		let loc = lex::mkloc(lexer);
		match (try(lexer, ltok::ELLIPSIS)?) {
		case common::token =>
			variadism = ast::variadism::C;
			want(lexer, ltok::RPAREN)?;
			break;
		case void => void;
		};

		let name_or_type = _type(lexer)?;
		match (try(lexer, ltok::COLON)?) {
		case void =>
			append(params, ast::func_param {
				loc = loc,
				name = "",
				_type = alloc(name_or_type)!,
				default_value = void,
			})!;
		case common::token =>
			// Bit of a hack because we can't unlex twice.
synassert(loc, name_or_type.repr is ast::alias_type, "Invalid parameter name")?; let ns = (name_or_type.repr as ast::alias_type).ident; synassert(loc, len(ns) == 1, "Invalid parameter name")?; append(params, ast::func_param { loc = loc, name = ns[0], _type = alloc(_type(lexer)?)!, default_value = void, })!; }; match (try(lexer, ltok::EQUAL)?) { case void => yield void; case common::token => params[len(params) - 1].default_value = expr(lexer)?; }; match (try(lexer, ltok::ELLIPSIS)?) { case common::token => variadism = ast::variadism::HARE; want(lexer, ltok::RPAREN)?; break; case void => void; }; match (try(lexer, ltok::COMMA)?) { case void => want(lexer, ltok::RPAREN)?; break; case common::token => void; }; }; let t = _type(lexer)?; return ast::func_type { result = alloc(t)!, variadism = variadism, params = params, }; }; fn integer_type( lexer: *lex::lexer, ) (builtin_type | error) = { switch (want(lexer)?.0) { case ltok::INT => return builtin_type::INT; case ltok::I8 => return builtin_type::I8; case ltok::I16 => return builtin_type::I16; case ltok::I32 => return builtin_type::I32; case ltok::I64 => return builtin_type::I64; case ltok::SIZE => return builtin_type::SIZE; case ltok::UINT => return builtin_type::UINT; case ltok::UINTPTR => return builtin_type::UINTPTR; case ltok::U8 => return builtin_type::U8; case ltok::U16 => return builtin_type::U16; case ltok::U32 => return builtin_type::U32; case ltok::U64 => return builtin_type::U64; case => return syntaxerr(lex::mkloc(lexer), "Expected integer type"); }; }; fn primitive_type(lexer: *lex::lexer) (ast::_type | error) = { let tok = want(lexer)?; let builtin = switch (tok.0) { case ltok::I8, ltok::I16, ltok::I32, ltok::I64, ltok::INT, ltok::UINT, ltok::UINTPTR, ltok::SIZE, ltok::U8, ltok::U16, ltok::U32, ltok::U64 => lex::unlex(lexer, tok); yield integer_type(lexer)?; case ltok::RUNE => yield builtin_type::RUNE; case ltok::STR => yield builtin_type::STR; case ltok::F32 => yield builtin_type::F32; case ltok::F64 => 
yield builtin_type::F64; case ltok::BOOL => yield builtin_type::BOOL; case ltok::DONE => yield builtin_type::DONE; case ltok::VALIST => yield builtin_type::VALIST; case ltok::VOID => yield builtin_type::VOID; case ltok::OPAQUE => yield builtin_type::OPAQUE; case ltok::NEVER => yield builtin_type::NEVER; case ltok::NOMEM => yield builtin_type::NOMEM; case => return syntaxerr(lex::mkloc(lexer), "Unexpected {}, was expecting primitive type", common::tokstr(tok)); }; return ast::_type { start = tok.2, end = lex::prevloc(lexer), flags = ast::type_flag::NONE, repr = builtin, }; }; fn alias_type(lexer: *lex::lexer) (ast::_type | error) = { const start = lex::mkloc(lexer); let unwrap = try(lexer, ltok::ELLIPSIS)? is common::token; let ident = ident(lexer)?; return ast::_type { start = start, end = lex::prevloc(lexer), flags = 0, repr = ast::alias_type { unwrap = unwrap, ident = ident, }, }; }; fn pointer_type(lexer: *lex::lexer) (ast::_type | error) = { const start = lex::mkloc(lexer); let flags = match (try(lexer, ltok::NULLABLE)?) { case void => yield ast::pointer_flag::NONE; case => yield ast::pointer_flag::NULLABLE; }; want(lexer, ltok::TIMES)?; let _type = _type(lexer)?; return ast::_type { start = start, end = lex::prevloc(lexer), flags = ast::type_flag::NONE, repr = ast::pointer_type { referent = alloc(_type)!, flags = flags, }, }; }; fn tagged_type( lexer: *lex::lexer, first: ast::_type, start: common::location ) (ast::_type | error) = { let tagged: ast::tagged_type = []; append(tagged, alloc(first)!)!; for (true) { append(tagged, alloc(_type(lexer)?)!)!; match (try(lexer, ltok::BOR)?) 
{
		case common::token =>
			match (try(lexer, ltok::RPAREN)?) {
			case common::token =>
				break;
			case => void;
			};
		case void =>
			want(lexer, ltok::RPAREN)?;
			break;
		};
	};
	return ast::_type {
		start = start,
		end = lex::prevloc(lexer),
		flags = ast::type_flag::NONE,
		repr = tagged,
	};
};

fn tuple_type(
	lexer: *lex::lexer,
	first: ast::_type,
	start: common::location
) (ast::_type | error) = {
	let tuple: ast::tuple_type = [];
	append(tuple, alloc(first)!)!;
	for (true) {
		append(tuple, alloc(_type(lexer)?)!)!;
		match (try(lexer, ltok::COMMA)?) {
		case common::token =>
			match (try(lexer, ltok::RPAREN)?) {
			case common::token =>
				break;
			case => void;
			};
		case void =>
			want(lexer, ltok::RPAREN)?;
			break;
		};
	};
	return ast::_type {
		start = start,
		end = lex::prevloc(lexer),
		flags = ast::type_flag::NONE,
		repr = tuple,
	};
};

fn fn_type(lexer: *lex::lexer) (ast::_type | error) = {
	const start = lex::mkloc(lexer);
	want(lexer, ltok::FN)?;
	let proto = prototype(lexer)?;
	return ast::_type {
		start = start,
		end = lex::prevloc(lexer),
		flags = 0,
		repr = proto,
	};
};

fn struct_union_type(lexer: *lex::lexer) (ast::_type | error) = {
	let membs: []ast::struct_member = [];
	let kind = want(lexer, ltok::STRUCT, ltok::UNION)?;
	let packed = false;
	if (kind.0 == ltok::STRUCT
			&& try(lexer, ltok::ATTR_PACKED)? is common::token) {
		packed = true;
	};
	want(lexer, ltok::LBRACE)?;
	for (true) {
		if (try(lexer, ltok::RBRACE)? is common::token) {
			synassert(lex::mkloc(lexer), len(membs) != 0,
				"Expected field list")?;
			break;
		};
		let comment = "";
		let offs: nullable *ast::expr = match (try(lexer, ltok::ATTR_OFFSET)?)
{
		case void =>
			yield null;
		case common::token =>
			comment = strings::dup(lex::comment(lexer))!;
			want(lexer, ltok::LPAREN)?;
			let ex = expr(lexer)?;
			want(lexer, ltok::RPAREN)?;
			yield alloc(ex)!;
		};
		let tok = want(lexer, ltok::NAME, ltok::STRUCT, ltok::UNION)?;
		if (comment == "") {
			comment = strings::dup(lex::comment(lexer))!;
		};
		switch (tok.0) {
		case ltok::NAME =>
			lex::unlex(lexer, tok);
			let memb = struct_embed_or_field(lexer, offs, comment)?;
			append(membs, memb)!;
		case ltok::STRUCT, ltok::UNION =>
			lex::unlex(lexer, tok);
			let subtype = struct_union_type(lexer)?;
			append(membs, ast::struct_member {
				_offset = offs,
				member = alloc(subtype)!,
				docs = comment,
			})!;
		case => abort();
		};
		switch (want(lexer, ltok::RBRACE, ltok::COMMA)?.0) {
		case ltok::RBRACE =>
			break;
		case ltok::COMMA =>
			const linecomment = lex::comment(lexer);
			const docs = &membs[len(membs) - 1].docs;
			if (linecomment != "" && *docs == "") {
				*docs = strings::dup(linecomment)!;
				free(lexer.comment);
				lexer.comment = "";
			};
		case => abort();
		};
	};
	return ast::_type {
		start = kind.2,
		end = lex::prevloc(lexer),
		flags = ast::type_flag::NONE,
		repr = switch (kind.0) {
		case ltok::STRUCT =>
			yield ast::struct_type { members = membs, packed = packed, ...};
		case ltok::UNION =>
			yield membs: ast::union_type;
		case => abort();
		},
	};
};

fn struct_embed_or_field(
	lexer: *lex::lexer,
	offs: nullable *ast::expr,
	comment: str,
) (ast::struct_member | error) = {
	// Disambiguates between `name: type` and `identifier`
	//
	// struct-union-field
	//	name : type
	//	identifier
	//
	// identifier
	//	name
	//	name :: identifier
	let name = want(lexer, ltok::NAME)?;
	let id: ast::ident = match (try(lexer, ltok::COLON, ltok::DOUBLE_COLON)?)
{ case void => yield alloc([name.1 as str])!; case let tok: common::token => yield switch (tok.0) { case ltok::COLON => let field = ast::struct_field { name = name.1 as str, _type = alloc(_type(lexer)?)!, }; return ast::struct_member { _offset = offs, member = field, docs = comment, }; case ltok::DOUBLE_COLON => let id = ident(lexer)?; insert(id[0], name.1 as str)!; yield id; case => abort(); }; }; return ast::struct_member { _offset = offs, member = id: ast::struct_alias, docs = comment, }; }; fn array_slice_type(lexer: *lex::lexer) (ast::_type | error) = { let start = want(lexer, ltok::LBRACKET)?; let length = match (try(lexer, ltok::UNDERSCORE, ltok::TIMES, ltok::RBRACKET)?) { case void => yield alloc(expr(lexer)?)!; case let tok: common::token => yield switch (tok.0) { case ltok::UNDERSCORE => yield ast::len_contextual; case ltok::TIMES => yield ast::len_unbounded; case ltok::RBRACKET => yield ast::len_slice; case => abort(); }; }; if (!(length is ast::len_slice)) { want(lexer, ltok::RBRACKET)?; }; let _type = _type(lexer)?; return ast::_type { start = start.2, end = lex::prevloc(lexer), flags = ast::type_flag::NONE, repr = ast::list_type { length = length, members = alloc(_type)!, }, }; }; fn enum_type(lexer: *lex::lexer) (ast::_type | error) = { let start = want(lexer, ltok::ENUM)?; const storage = match (try(lexer, ltok::LBRACE, ltok::RUNE)?) { case void => let storage = integer_type(lexer)?; want(lexer, ltok::LBRACE)?; yield storage; case let tok: common::token => yield switch (tok.0) { case ltok::LBRACE => yield builtin_type::INT; case ltok::RUNE => want(lexer, ltok::LBRACE)?; yield builtin_type::RUNE; case => abort(); // unreachable }; }; let membs: []ast::enum_field = []; for (true) { if (try(lexer, ltok::RBRACE)? 
is common::token) {
			synassert(lex::mkloc(lexer), len(membs) != 0,
				"Expected member list")?;
			break;
		};
		const loc = lex::mkloc(lexer);
		let name = want(lexer, ltok::NAME)?;
		let comment = strings::dup(lex::comment(lexer))!;
		let value: nullable *ast::expr =
			if (try(lexer, ltok::EQUAL)? is common::token)
				alloc(expr(lexer)?)!
			else null;
		defer append(membs, ast::enum_field {
			name = name.1 as str,
			value = value,
			loc = loc,
			docs = comment,
		})!;
		switch (want(lexer, ltok::COMMA, ltok::RBRACE)?.0) {
		case ltok::COMMA =>
			const linecomment = lex::comment(lexer);
			if (linecomment != "" && comment == "") {
				free(comment);
				comment = strings::dup(linecomment)!;
				free(lexer.comment);
				lexer.comment = "";
			};
		case ltok::RBRACE =>
			break;
		case => abort();
		};
	};
	return ast::_type {
		start = start.2,
		end = lex::prevloc(lexer),
		flags = ast::type_flag::NONE,
		repr = ast::enum_type {
			storage = storage,
			values = membs,
		},
	};
};

// Parses a type, e.g. '[]int'.
export fn _type(lexer: *lex::lexer) (ast::_type | error) = {
	let flags = ast::type_flag::NONE;
	if (try(lexer, ltok::CONST)? is common::token) {
		flags |= ast::type_flag::CONST;
	};
	if (try(lexer, ltok::LNOT)? is common::token) {
		flags |= ast::type_flag::ERROR;
	};
	let tok = peek(lexer)?
as common::token; let typ: ast::_type = switch (tok.0) { case ltok::RUNE, ltok::STR, ltok::BOOL, ltok::DONE, ltok::I8, ltok::I16, ltok::I32, ltok::I64, ltok::U8, ltok::U16, ltok::U32, ltok::U64, ltok::INT, ltok::UINT, ltok::UINTPTR, ltok::SIZE, ltok::F32, ltok::F64, ltok::VALIST, ltok::VOID, ltok::OPAQUE, ltok::NEVER, ltok::NOMEM => yield primitive_type(lexer)?; case ltok::ENUM => yield enum_type(lexer)?; case ltok::NULLABLE, ltok::TIMES => yield pointer_type(lexer)?; case ltok::STRUCT, ltok::UNION => yield struct_union_type(lexer)?; case ltok::LBRACKET => yield array_slice_type(lexer)?; case ltok::LPAREN => want(lexer, ltok::LPAREN)?; let t = _type(lexer)?; yield switch (want(lexer, ltok::BOR, ltok::COMMA)?.0) { case ltok::BOR => yield tagged_type(lexer, t, tok.2)?; case ltok::COMMA => yield tuple_type(lexer, t, tok.2)?; case => abort("unreachable"); }; case ltok::FN => yield fn_type(lexer)?; case ltok::ELLIPSIS, ltok::NAME => yield alias_type(lexer)?; case => return syntaxerr(lex::mkloc(lexer), "Unexpected {}, was expecting type", common::tokstr(tok)); }; typ.flags |= flags; return typ; }; hare-update-0.25.2.0/v0_25_2/parse/unit.ha000066400000000000000000000005331503370650000176140ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use v0_25_2::ast; use v0_25_2::lex; // Parses an entire subunit (i.e. one Hare source file). export fn subunit(lexer: *lex::lexer) (ast::subunit | error) = { let i = imports(lexer)?; let d = decls(lexer)?; return ast::subunit { imports = i, decls = d, }; }; hare-update-0.25.2.0/v0_25_2/parse/util.ha000066400000000000000000000037231503370650000176160ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use common; use common::{ltok}; use fmt; use io; use memio; use v0_25_2::lex; // Requires the next token to have a matching ltok. Returns that token, or an // error. export fn want(lexer: *lex::lexer, want: common::ltok...) 
(common::token | error) = {
	let tok = lex::lex(lexer)?;
	if (len(want) == 0) {
		return tok;
	};
	for (let i = 0z; i < len(want); i += 1) {
		if (tok.0 == want[i]) {
			return tok;
		};
	};
	let buf = memio::dynamic();
	defer io::close(&buf)!;
	for (let i = 0z; i < len(want); i += 1) {
		const tstr = if (want[i] == common::ltok::NAME) "name"
			else common::tokstr((want[i], void, lex::mkloc(lexer)));
		fmt::fprintf(&buf, "'{}'", tstr)!;
		if (i + 1 < len(want)) {
			fmt::fprint(&buf, ", ")!;
		};
	};
	lex::unlex(lexer, tok);
	return syntaxerr(lex::mkloc(lexer),
		"Unexpected '{}', was expecting {}",
		common::tokstr(tok), memio::string(&buf)!);
};

// Looks for a matching ltok from the lexer, and if not present, unlexes the
// token and returns void. If found, the token is consumed from the lexer and is
// returned.
export fn try(
	lexer: *lex::lexer,
	want: common::ltok...
) (common::token | error | void) = {
	let tok = lex::lex(lexer)?;
	assert(len(want) > 0);
	for (let i = 0z; i < len(want); i += 1) {
		if (tok.0 == want[i]) {
			return tok;
		};
	};
	lex::unlex(lexer, tok);
};

// Looks for a matching ltok from the lexer, unlexes the token, and returns it;
// or void if it does not match any of the given ltoks.
export fn peek(
	lexer: *lex::lexer,
	want: common::ltok...
) (common::token | error | void) = {
	let tok = lex::lex(lexer)?;
	lex::unlex(lexer, tok);
	if (len(want) == 0) {
		return tok;
	};
	for (let i = 0z; i < len(want); i += 1) {
		if (tok.0 == want[i]) {
			return tok;
		};
	};
};

// Returns a syntax error if cond is false and void otherwise.
fn synassert(loc: common::location, cond: bool, msg: str) (void | error) = {
	if (!cond) {
		return syntaxerr(loc, "{}", msg);
	};
};

hare-update-0.25.2.0/v0_25_2/unparse/000077500000000000000000000000001503370650000166655ustar00rootroot00000000000000
hare-update-0.25.2.0/v0_25_2/unparse/README000066400000000000000000000002231503370650000175420ustar00rootroot00000000000000
hare::unparse provides an unparser for Hare.
All functions take in some part of the AST and write formatted Hare source code to an [[io::handle]]. hare-update-0.25.2.0/v0_25_2/unparse/decl.ha000066400000000000000000000112551503370650000201120ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use fmt; use v0_25_2::ast; use v0_25_2::lex; use io; use memio; use strings; // Unparses a [[hare::ast::decl]]. export fn decl( out: io::handle, syn: *synfunc, d: *ast::decl, ) (size | io::error) = { let n = 0z; let ctx = context { out = out, stack = &stack { cur = d, ... }, ... }; if (len(d.docs) > 0) { n += comment(&ctx, syn, d.docs)?; }; if (d.exported) { n += syn(&ctx, "export", synkind::KEYWORD)?; n += space(&ctx)?; }; match (d.decl) { case let c: []ast::decl_const => n += syn(&ctx, "def", synkind::KEYWORD)?; n += space(&ctx)?; for (let i = 0z; i < len(c); i += 1) { n += _ident(&ctx, syn, c[i].ident, synkind::CONSTANT)?; match (c[i]._type) { case null => void; case let ty: *ast::_type => n += syn(&ctx, ":", synkind::PUNCTUATION)?; n += space(&ctx)?; n += __type(&ctx, syn, ty)?; }; n += space(&ctx)?; n += syn(&ctx, "=", synkind::OPERATOR)?; n += space(&ctx)?; n += _expr(&ctx, syn, c[i].init)?; if (i + 1 < len(c)) { n += syn(&ctx, ",", synkind::PUNCTUATION)?; n += space(&ctx)?; }; }; case let g: []ast::decl_global => n += syn(&ctx, if (g[0].is_const) "const" else "let", synkind::KEYWORD)?; n += space(&ctx)?; for (let i = 0z; i < len(g); i += 1) { if (len(g[i].symbol) != 0) { n += syn(&ctx, "@symbol(", synkind::ATTRIBUTE)?; n += literal(&ctx, syn, g[i].symbol)?; n += syn(&ctx, ")", synkind::ATTRIBUTE)?; n += space(&ctx)?; } else if (g[i].is_threadlocal) { n += syn(&ctx, "@threadlocal", synkind::ATTRIBUTE)?; n += space(&ctx)?; }; n += _ident(&ctx, syn, g[i].ident, synkind::GLOBAL)?; match (g[i]._type) { case null => void; case let ty: *ast::_type => n += syn(&ctx, ":", synkind::PUNCTUATION)?; n += space(&ctx)?; n += __type(&ctx, syn, ty)?; }; match (g[i].init) { case null => void; case 
let ex: *ast::expr => n += space(&ctx)?; n += syn(&ctx, "=", synkind::OPERATOR)?; n += space(&ctx)?; n += _expr(&ctx, syn, ex)?; }; if (i + 1 < len(g)) { n += syn(&ctx, ",", synkind::OPERATOR)?; n += space(&ctx)?; }; }; case let t: []ast::decl_type => n += syn(&ctx, "type", synkind::KEYWORD)?; n += space(&ctx)?; for (let i = 0z; i < len(t); i += 1) { n += _ident(&ctx, syn, t[i].ident, synkind::TYPEDEF)?; n += space(&ctx)?; n += syn(&ctx, "=", synkind::OPERATOR)?; n += space(&ctx)?; n += __type(&ctx, syn, t[i]._type)?; if (i + 1 < len(t)) { n += syn(&ctx, ",", synkind::PUNCTUATION)?; n += space(&ctx)?; }; }; case let f: ast::decl_func => ctx.stack = &stack { cur = f.prototype, up = ctx.stack, ... }; defer { let stack = &(ctx.stack as *stack); match (stack.extra) { case let p: *opaque => free(p); case null => void; }; ctx.stack = stack.up; }; switch (f.attrs) { case ast::fndecl_attr::NONE => void; case ast::fndecl_attr::FINI => n += syn(&ctx, "@fini", synkind::ATTRIBUTE)?; n += space(&ctx)?; case ast::fndecl_attr::INIT => n += syn(&ctx, "@init", synkind::ATTRIBUTE)?; n += space(&ctx)?; case ast::fndecl_attr::TEST => n += syn(&ctx, "@test", synkind::ATTRIBUTE)?; n += space(&ctx)?; }; let p = f.prototype.repr as ast::func_type; if (len(f.symbol) != 0) { n += syn(&ctx, "@symbol(", synkind::ATTRIBUTE)?; n += literal(&ctx, syn, f.symbol)?; n += syn(&ctx, ")", synkind::ATTRIBUTE)?; n += space(&ctx)?; }; n += syn(&ctx, "fn", synkind::KEYWORD)?; n += space(&ctx)?; n += _ident(&ctx, syn, f.ident, synkind::FUNCTION)?; const fntype = f.prototype.repr as ast::func_type; n += prototype(&ctx, syn, &fntype)?; match (f.body) { case null => void; case let e: *ast::expr => n += space(&ctx)?; n += syn(&ctx, "=", synkind::OPERATOR)?; n += space(&ctx)?; n += _expr(&ctx, syn, e)?; }; case let e: ast::assert_expr => n += assert_expr(&ctx, syn, &e)?; }; n += syn(&ctx, ";", synkind::PUNCTUATION)?; return n; }; fn comment(ctx: *context, syn: *synfunc, s: str) (size | io::error) = { let n = 
0z; let s = strings::trimsuffix(s, "\n"); let s = strings::tokenize(s, "\n"); for (let line => strings::next_token(&s)) { for (let i = 0z; i < ctx.indent; i += 1) { n += syn(ctx, "\t", synkind::COMMENT)?; ctx.linelen += 8; }; n += syn(ctx, "//", synkind::COMMENT)?; n += syn(ctx, line, synkind::COMMENT)?; n += syn(ctx, "\n", synkind::COMMENT)?; ctx.linelen = 0; }; return n; }; fn decl_test(d: *ast::decl, expected: str) bool = { let buf = memio::dynamic(); decl(&buf, &syn_nowrap, d)!; let s = memio::string(&buf)!; defer free(s); fmt::println(s)!; return s == expected; }; hare-update-0.25.2.0/v0_25_2/unparse/expr.ha000066400000000000000000000656651503370650000201770ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use common::{ltok}; use fmt; use io; use strings; use v0_25_2::ast; use v0_25_2::ast::{binarithm_op}; // Unparses a [[hare::ast::expr]]. export fn expr( out: io::handle, syn: *synfunc, e: *ast::expr, ) (size | io::error) = { let ctx = context { out = out, ... }; return _expr(&ctx, syn, e); }; fn _expr(ctx: *context, syn: *synfunc, e: *ast::expr) (size | io::error) = { ctx.stack = &stack { cur = e, up = ctx.stack, ... 
}; defer { let stack = &(ctx.stack as *stack); match (stack.extra) { case let p: *opaque => free(p); case null => void; }; ctx.stack = stack.up; }; match (e.expr) { case let e: ast::access_expr => match (e) { case let id: ast::access_identifier => return _ident(ctx, syn, id, synkind::IDENT); case let ix: ast::access_index => let z = 0z; const needs_parens = !is_postfix(ix.object); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, ix.object)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; z += syn(ctx, "[", synkind::PUNCTUATION)?; z += _expr(ctx, syn, ix.index)?; z += syn(ctx, "]", synkind::PUNCTUATION)?; return z; case let fi: ast::access_field => let z = 0z; const needs_parens = !is_postfix(fi.object); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, fi.object)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; z += syn(ctx, ".", synkind::OPERATOR)?; z += syn(ctx, fi.field, synkind::SECONDARY)?; return z; case let tp: ast::access_tuple => let z = 0z; const needs_parens = !is_postfix(tp.object); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, tp.object)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; z += syn(ctx, ".", synkind::OPERATOR)?; z += _expr(ctx, syn, tp.value)?; return z; }; case let e: ast::align_expr => let z = syn(ctx, "align", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += __type(ctx, syn, e)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::alloc_expr => let z = syn(ctx, "alloc", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e.init)?; match (e.capacity) { case null => if (e.form == ast::alloc_form::COPY) { z += syn(ctx, "...", synkind::OPERATOR)?; }; case let e: *ast::expr => z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; z += _expr(ctx, syn, e)?; }; z += syn(ctx, ")", 
synkind::PUNCTUATION)?; return z; case ast::append_expr => return append_insert_expr(ctx, syn, e); case let e: ast::assert_expr => return assert_expr(ctx, syn, &e); case let e: ast::assign_expr => let z = 0z; z += _expr(ctx, syn, e.object)?; const op = match (e.op) { case void => yield "="; case let op: binarithm_op => yield switch (op) { case binarithm_op::BAND => yield "&="; case binarithm_op::LAND => yield "&&="; case binarithm_op::BOR => yield "|="; case binarithm_op::LOR => yield "||="; case binarithm_op::DIV => yield "/="; case binarithm_op::LSHIFT => yield "<<="; case binarithm_op::MINUS => yield "-="; case binarithm_op::MODULO => yield "%="; case binarithm_op::PLUS => yield "+="; case binarithm_op::RSHIFT => yield ">>="; case binarithm_op::TIMES => yield "*="; case binarithm_op::BXOR => yield "^="; case binarithm_op::LXOR => yield "^^="; case binarithm_op::GT, binarithm_op::GTEQ, binarithm_op::LESS, binarithm_op::LESSEQ, binarithm_op::LEQUAL, binarithm_op::NEQUAL => abort(); // unreachable }; }; z += space(ctx)?; z += syn(ctx, op, synkind::OPERATOR)?; z += space(ctx)?; z += _expr(ctx, syn, e.value)?; return z; case let e: ast::binarithm_expr => const prec = binprecedence(e.op); let z = binexprval(ctx, syn, e.lvalue, prec)?; z += space(ctx)?; z += syn(ctx, switch (e.op) { case binarithm_op::BAND => yield "&"; case binarithm_op::BOR => yield "|"; case binarithm_op::DIV => yield "/"; case binarithm_op::GT => yield ">"; case binarithm_op::GTEQ => yield ">="; case binarithm_op::LAND => yield "&&"; case binarithm_op::LEQUAL => yield "=="; case binarithm_op::LESS => yield "<"; case binarithm_op::LESSEQ => yield "<="; case binarithm_op::LOR => yield "||"; case binarithm_op::LSHIFT => yield "<<"; case binarithm_op::LXOR => yield "^^"; case binarithm_op::MINUS => yield "-"; case binarithm_op::MODULO => yield "%"; case binarithm_op::NEQUAL => yield "!="; case binarithm_op::PLUS => yield "+"; case binarithm_op::RSHIFT => yield ">>"; case binarithm_op::TIMES => yield 
"*"; case binarithm_op::BXOR => yield "^"; }, synkind::OPERATOR)?; z += space(ctx)?; z += binexprval(ctx, syn, e.rvalue, prec)?; return z; case let e: ast::binding_expr => return binding_expr(ctx, syn, &e, "=")?; case let e: ast::break_expr => let z = syn(ctx, "break", synkind::KEYWORD)?; if (e != "") { z += space(ctx)?; z += syn(ctx, ":", synkind::LABEL)?; z += syn(ctx, e, synkind::LABEL)?; }; return z; case let e: ast::call_expr => let z = 0z; const needs_parens = !is_postfix(e.lvalue); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, e.lvalue)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; z += syn(ctx, "(", synkind::PUNCTUATION)?; for (let i = 0z; i < len(e.args); i += 1) { z += _expr(ctx, syn, e.args[i])?; if (i + 1 < len(e.args)) { z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; }; }; if (e.variadic) { z += syn(ctx, "...", synkind::OPERATOR)?; }; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::cast_expr => let z = 0z; const needs_parens = !is_cast(e.value); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, e.value)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; switch (e.kind) { case ast::cast_kind::CAST => z += syn(ctx, ":", synkind::OPERATOR)?; z += space(ctx)?; case ast::cast_kind::ASSERTION => z += space(ctx)?; z += syn(ctx, "as", synkind::OPERATOR)?; z += space(ctx)?; case ast::cast_kind::TEST => z += space(ctx)?; z += syn(ctx, "is", synkind::OPERATOR)?; z += space(ctx)?; }; z += __type(ctx, syn, e._type)?; return z; case let e: ast::literal_expr => return literal(ctx, syn, e)?; case let e: ast::continue_expr => let z = syn(ctx, "continue", synkind::KEYWORD)?; if (e != "") { z += space(ctx)?; z += syn(ctx, ":", synkind::LABEL)?; z += syn(ctx, e, synkind::LABEL)?; }; return z; case let e: ast::defer_expr => let z = syn(ctx, "defer", synkind::KEYWORD)?; z += space(ctx)?; z += _expr(ctx, syn, e)?; return z; 
case let e: ast::delete_expr => let z = 0z; if (e.is_static) { z += syn(ctx, "static", synkind::KEYWORD)?; z += space(ctx)?; }; z += syn(ctx, "delete", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e.object)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::error_assert_expr => let z = 0z; const needs_parens = !is_postfix(e); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, e)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; z += syn(ctx, "!", synkind::OPERATOR)?; return z; case let e: ast::for_expr => return for_expr(ctx, syn, &e)?; case let e: ast::free_expr => let z = syn(ctx, "free", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::if_expr => let z = syn(ctx, "if", synkind::KEYWORD)?; z += space(ctx)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e.cond)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; z += space(ctx)?; z += _expr(ctx, syn, e.tbranch)?; match (e.fbranch) { case null => void; case let e: *ast::expr => z += space(ctx)?; z += syn(ctx, "else", synkind::KEYWORD)?; z += space(ctx)?; z += _expr(ctx, syn, e)?; }; return z; case ast::insert_expr => return append_insert_expr(ctx, syn, e); case let e: ast::compound_expr => let z = 0z; if (e.label != "") { z += syn(ctx, ":", synkind::LABEL)?; z += syn(ctx, e.label, synkind::LABEL)?; z += space(ctx)?; }; z += syn(ctx, "{", synkind::PUNCTUATION)?; ctx.indent += 1; for (let expr .. 
e.exprs) { z += newline(ctx)?; z += stmt(ctx, syn, expr)?; }; ctx.indent -= 1; z += newline(ctx)?; z += syn(ctx, "}", synkind::PUNCTUATION)?; return z; case let e: ast::match_expr => return match_expr(ctx, syn, &e)?; case let e: ast::len_expr => let z = syn(ctx, "len", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::size_expr => let z = syn(ctx, "size", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += __type(ctx, syn, e)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::offset_expr => let z = syn(ctx, "offset", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::propagate_expr => let z = 0z; const needs_parens = !is_postfix(e); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, e)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; z += syn(ctx, "?", synkind::OPERATOR)?; return z; case let e: ast::return_expr => let z = syn(ctx, "return", synkind::KEYWORD)?; match (e) { case null => void; case let e: *ast::expr => z += space(ctx)?; z += _expr(ctx, syn, e)?; }; return z; case let e: ast::slice_expr => let z = 0z; const needs_parens = !is_postfix(e.object); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, e.object)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; z += syn(ctx, "[", synkind::PUNCTUATION)?; match (e.start) { case null => void; case let e: *ast::expr => z += _expr(ctx, syn, e)?; }; z += syn(ctx, "..", synkind::OPERATOR)?; match (e.end) { case null => void; case let e: *ast::expr => z += _expr(ctx, syn, e)?; }; z += syn(ctx, "]", synkind::PUNCTUATION)?; return z; case let e: ast::switch_expr => return switch_expr(ctx, syn, &e)?; case let e: ast::unarithm_expr => let z = syn(ctx, switch (e.op) 
{ case ast::unarithm_op::ADDR => yield "&"; case ast::unarithm_op::BNOT => yield "~"; case ast::unarithm_op::DEREF => yield "*"; case ast::unarithm_op::LNOT => yield "!"; case ast::unarithm_op::MINUS => yield "-"; }, synkind::OPERATOR)?; const needs_parens = match (e.operand.expr) { case let inner: ast::unarithm_expr => yield e.op == ast::unarithm_op::ADDR && inner.op == e.op; case => yield !is_unary(e.operand); }; if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, e.operand)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; return z; case let e: ast::variadic_expr => match (e) { case ast::vastart_expr => let z = syn(ctx, "vastart", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::vaarg_expr => let z = syn(ctx, "vaarg", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e.ap)?; z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; z += __type(ctx, syn, e._type)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; case let e: ast::vaend_expr => let z = syn(ctx, "vaend", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; }; case let e: ast::yield_expr => let z = syn(ctx, "yield", synkind::KEYWORD)?; if (e.label != "") { z += space(ctx)?; z += syn(ctx, ":", synkind::LABEL)?; z += syn(ctx, e.label, synkind::LABEL)?; }; match (e.value) { case null => void; case let v: *ast::expr => if (e.label != "") { z += syn(ctx, ",", synkind::PUNCTUATION)?; }; z += space(ctx)?; z += _expr(ctx, syn, v)?; }; return z; }; }; fn binprecedence(op: binarithm_op) uint = { switch (op) { case binarithm_op::DIV, binarithm_op::MODULO, binarithm_op::TIMES => return 10; case binarithm_op::MINUS, binarithm_op::PLUS => return 9; case binarithm_op::LSHIFT, binarithm_op::RSHIFT => return 8; case binarithm_op::BAND => return 7; case 
binarithm_op::BXOR => return 6; case binarithm_op::BOR => return 5; case binarithm_op::GT, binarithm_op::GTEQ, binarithm_op::LESS, binarithm_op::LESSEQ => return 4; case binarithm_op::LEQUAL, binarithm_op::NEQUAL => return 3; case binarithm_op::LAND => return 2; case binarithm_op::LXOR => return 1; case binarithm_op::LOR => return 0; }; }; fn binexprval( ctx: *context, syn: *synfunc, e: *ast::expr, prec: uint, ) (size | io::error) = { let z = 0z; match (e.expr) { case let b: ast::binarithm_expr => if (binprecedence(b.op) < prec) { z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; }; case => void; }; const needs_parens = !is_cast(e) && !(e.expr is ast::binarithm_expr); if (needs_parens) { z += syn(ctx, "(", synkind::PUNCTUATION)?; }; z += _expr(ctx, syn, e)?; if (needs_parens) { z += syn(ctx, ")", synkind::PUNCTUATION)?; }; return z; }; fn stmt(ctx: *context, syn: *synfunc, e: *ast::expr) (size | io::error) = { let n = _expr(ctx, syn, e)?; n += syn(ctx, ";", synkind::PUNCTUATION)?; return n; }; fn literal( ctx: *context, syn: *synfunc, e: ast::literal_expr, ) (size | io::error) = { match (e) { case void => return syn(ctx, "void", synkind::KEYWORD)?; case let v: ast::value => match (v) { case void => abort(); case ast::_null => return syn(ctx, "null", synkind::KEYWORD)?; case done => return syn(ctx, "done", synkind::KEYWORD)?; case nomem => return syn(ctx, "nomem", synkind::KEYWORD)?; case let b: bool => return syn(ctx, if (b) "true" else "false", synkind::KEYWORD)?; case let s: str => const s = strings::multireplace(s, (`\`, `\\`), (`"`, `\"`))!; defer free(s); const s = fmt::asprintf(`"{}"`, s)?; defer free(s); return syn(ctx, s, synkind::RUNE_STRING)?; case let r: rune => // 4 for unicode codepoint + 2 's let buf: [6]u8 = [0...]; if (r == '\'' || r == '\\') { return syn(ctx, fmt::bsprintf(buf, `'\{}'`, r)!, synkind::RUNE_STRING)?; } else { return syn(ctx, fmt::bsprintf(buf, "'{}'", r)!, 
synkind::RUNE_STRING)?; }; }; case let ac: ast::array_literal => let z = syn(ctx, "[", synkind::PUNCTUATION)?; for (let i = 0z; i < len(ac.values); i += 1) { z += _expr(ctx, syn, ac.values[i])?; if (i + 1 < len(ac.values)) { z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; }; }; if (ac.expand) { z += syn(ctx, "...", synkind::OPERATOR)?; }; z += syn(ctx, "]", synkind::PUNCTUATION)?; return z; case let v: ast::number_literal => const s = switch (v.suff) { case ltok::LIT_U8 => yield fmt::asprintf("{}u8", v.value)?; case ltok::LIT_U16 => yield fmt::asprintf("{}u16", v.value)?; case ltok::LIT_U32 => yield fmt::asprintf("{}u32", v.value)?; case ltok::LIT_U64 => yield fmt::asprintf("{}u64", v.value)?; case ltok::LIT_UINT => yield fmt::asprintf("{}u", v.value)?; case ltok::LIT_SIZE => yield fmt::asprintf("{}z", v.value)?; case ltok::LIT_I8 => yield fmt::asprintf("{}i8", v.value)?; case ltok::LIT_I16 => yield fmt::asprintf("{}i16", v.value)?; case ltok::LIT_I32 => yield fmt::asprintf("{}i32", v.value)?; case ltok::LIT_I64 => yield fmt::asprintf("{}i64", v.value)?; case ltok::LIT_INT => yield fmt::asprintf("{}i", v.value)?; case ltok::LIT_ICONST => yield fmt::asprint(v.value)?; case ltok::LIT_FCONST => yield fmt::asprintf("{:F.}", v.value)?; case ltok::LIT_F32 => yield fmt::asprintf("{}f32", v.value)?; case ltok::LIT_F64 => yield fmt::asprintf("{}f64", v.value)?; case => abort(); }; defer free(s); return syn(ctx, s, synkind::NUMBER)?; case let sc: ast::struct_literal => return struct_literal(ctx, syn, sc)?; case let tu: ast::tuple_literal => let z = syn(ctx, "(", synkind::PUNCTUATION)?; for (let i = 0z; i < len(tu); i += 1) { z += _expr(ctx, syn, tu[i])?; if (i + 1 < len(tu)) { z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; }; }; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; }; }; fn struct_literal( ctx: *context, syn: *synfunc, sc: ast::struct_literal, ) (size | io::error) = { let z = 0z; z += if (len(sc.alias) != 0) { yield _ident(ctx, syn, 
sc.alias, synkind::IDENT)?; } else { yield syn(ctx, "struct", synkind::KEYWORD)?; }; z += space(ctx)?; z += syn(ctx, "{", synkind::PUNCTUATION)?; ctx.indent += 1; for (let field .. sc.fields) { z += newline(ctx)?; match (field) { case let sv: ast::struct_value => z += syn(ctx, sv.name, synkind::SECONDARY)?; match (sv._type) { case null => void; case let t: *ast::_type => z += syn(ctx, ":", synkind::PUNCTUATION)?; z += space(ctx)?; z += __type(ctx, syn, t)?; }; z += space(ctx)?; z += syn(ctx, "=", synkind::OPERATOR)?; z += space(ctx)?; z += _expr(ctx, syn, sv.init)?; case let sc: *ast::struct_literal => z += literal(ctx, syn, *sc)?; }; z += syn(ctx, ",", synkind::PUNCTUATION)?; }; if (sc.autofill) { z += newline(ctx)?; z += syn(ctx, "...", synkind::OPERATOR)?; }; ctx.indent -= 1; z += newline(ctx)?; z += syn(ctx, "}", synkind::PUNCTUATION)?; return z; }; fn binding_expr( ctx: *context, syn: *synfunc, e: *ast::binding_expr, assign_op: str ) (size | io::error) = { let z = 0z; if (e.is_static) { z += syn(ctx, "static", synkind::KEYWORD)?; z += space(ctx)?; }; switch (e.kind) { case ast::binding_kind::DEF => z += syn(ctx, "def", synkind::KEYWORD)?; case ast::binding_kind::CONST => z += syn(ctx, "const", synkind::KEYWORD)?; case ast::binding_kind::LET => z += syn(ctx, "let", synkind::KEYWORD)?; }; z += space(ctx)?; for (let i = 0z; i < len(e.bindings); i += 1) { let binding = e.bindings[i]; match (binding.name) { case let s: str => z += syn(ctx, s, synkind::IDENT)?; case let u: ast::binding_unpack => z += syn(ctx, "(", synkind::PUNCTUATION)?; for (let i = 0z; i < len(u); i += 1) { match (u[i]) { case let s: str => z += syn(ctx, s, synkind::IDENT)?; case void => z += syn(ctx, "_", synkind::OPERATOR)?; }; if (i + 1 < len(u)) { z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; }; }; z += syn(ctx, ")", synkind::PUNCTUATION)?; }; match (binding._type) { case let t: *ast::_type => z += syn(ctx, ":", synkind::PUNCTUATION)?; z += space(ctx)?; z += __type(ctx, syn, 
t)?; case null => void; }; z += space(ctx)?; z += syn(ctx, assign_op, synkind::OPERATOR)?; z += space(ctx)?; z += _expr(ctx, syn, binding.init)?; if (i + 1 < len(e.bindings)) { z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; }; }; return z; }; fn for_expr( ctx: *context, syn: *synfunc, e: *ast::for_expr, ) (size | io::error) = { let z = syn(ctx, "for", synkind::KEYWORD)?; z += space(ctx)?; if (e.label != "") { z += syn(ctx, ":", synkind::LABEL)?; z += syn(ctx, e.label, synkind::LABEL)?; z += space(ctx)?; }; z += syn(ctx, "(", synkind::PUNCTUATION)?; let assign_op = switch (e.kind) { case ast::for_kind::ACCUMULATOR => yield "="; case ast::for_kind::EACH_VALUE => yield ".."; case ast::for_kind::EACH_POINTER => yield "&.."; case ast::for_kind::ITERATOR => yield "=>"; }; match (e.bindings) { case let bind_expr: *ast::expr => z += binding_expr(ctx, syn, &(bind_expr.expr as ast::binding_expr), assign_op)?; if (e.kind == ast::for_kind::ACCUMULATOR) { z += syn(ctx, ";", synkind::PUNCTUATION)?; z += space(ctx)?; }; case null => void; }; if (e.kind == ast::for_kind::ACCUMULATOR) { z += _expr(ctx, syn, e.cond as *ast::expr)?; match (e.afterthought) { case null => void; case let e: *ast::expr => z += syn(ctx, ";", synkind::PUNCTUATION)?; z += space(ctx)?; z += _expr(ctx, syn, e)?; }; }; z += syn(ctx, ")", synkind::PUNCTUATION)?; z += space(ctx)?; z += _expr(ctx, syn, e.body)?; return z; }; fn switch_expr( ctx: *context, syn: *synfunc, e: *ast::switch_expr, ) (size | io::error) = { let z = syn(ctx, "switch", synkind::KEYWORD)?; z += space(ctx)?; if (e.label != "") { z += syn(ctx, ":", synkind::LABEL)?; z += syn(ctx, e.label, synkind::LABEL)?; z += space(ctx)?; }; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e.value)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; z += space(ctx)?; z += syn(ctx, "{", synkind::PUNCTUATION)?; for (let item .. 
e.cases) { z += newline(ctx)?; z += syn(ctx, "case", synkind::KEYWORD)?; z += space(ctx)?; if (len(item.options) == 0) { z += syn(ctx, "=>", synkind::OPERATOR)?; } else { for (let j = 0z; j < len(item.options); j += 1) { const opt = item.options[j]; z += _expr(ctx, syn, opt)?; if (j + 1 < len(item.options)) { z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; }; }; z += space(ctx)?; z += syn(ctx, "=>", synkind::OPERATOR)?; }; z += case_exprs(ctx, syn, item.exprs)?; }; z += newline(ctx)?; z += syn(ctx, "}", synkind::PUNCTUATION)?; return z; }; fn match_expr( ctx: *context, syn: *synfunc, e: *ast::match_expr, ) (size | io::error) = { let z = syn(ctx, "match", synkind::KEYWORD)?; z += space(ctx)?; if (e.label != "") { z += syn(ctx, ":", synkind::LABEL)?; z += syn(ctx, e.label, synkind::LABEL)?; z += space(ctx)?; }; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e.value)?; z += syn(ctx, ")", synkind::PUNCTUATION)?; z += space(ctx)?; z += syn(ctx, "{", synkind::PUNCTUATION)?; for (let item .. 
e.cases) { z += newline(ctx)?; z += syn(ctx, "case", synkind::KEYWORD)?; if (len(item.name) > 0) { z += space(ctx)?; z += syn(ctx, "let", synkind::KEYWORD)?; z += space(ctx)?; z += syn(ctx, item.name, synkind::IDENT)?; }; match (item._type) { case let typ: *ast::_type => if (len(item.name) > 0) { z += syn(ctx, ":", synkind::PUNCTUATION)?; }; z += space(ctx)?; z += __type(ctx, syn, typ)?; case null => void; }; z += space(ctx)?; z += syn(ctx, "=>", synkind::OPERATOR)?; z += case_exprs(ctx, syn, item.exprs)?; }; z += newline(ctx)?; z += syn(ctx, "}", synkind::PUNCTUATION)?; return z; }; fn case_exprs( ctx: *context, syn: *synfunc, exprs: []*ast::expr, ) (size | io::error) = { let z = 0z; if (len(exprs) == 1) match (exprs[0].expr) { case let e: ast::assert_expr => if (e.cond == null) { // abort() expression z += space(ctx)?; z += assert_expr(ctx, syn, &e)?; z += syn(ctx, ";", synkind::PUNCTUATION)?; return z; }; case let e: ast::value => if (e is void) { z += space(ctx)?; { ctx.stack = &stack { cur = exprs[0], up = ctx.stack, ... }; defer ctx.stack = (ctx.stack as *stack).up; z += syn(ctx, "void", synkind::KEYWORD)?; }; z += syn(ctx, ";", synkind::PUNCTUATION)?; return z; }; case => void; }; ctx.indent += 1; for (let expr .. 
exprs) { z += newline(ctx)?; z += stmt(ctx, syn, expr)?; }; ctx.indent -= 1; return z; }; fn is_plain(e: *ast::expr) bool = { match (e.expr) { case ast::literal_expr => return true; case ast::access_identifier => return true; case => return false; }; }; fn is_postfix(e: *ast::expr) bool = { if (is_builtin(e)) { return true; }; match (e.expr) { case ast::call_expr => return true; case ast::access_expr => return true; case ast::slice_expr => return true; case ast::error_assert_expr => return true; case ast::propagate_expr => return true; case => return false; }; }; fn is_builtin(e: *ast::expr) bool = { if (is_plain(e)) { return true; }; match (e.expr) { case ast::alloc_expr => return true; case ast::assert_expr => return true; case ast::variadic_expr => return true; // measurement-expression case ast::len_expr => return true; case ast::align_expr => return true; case ast::size_expr => return true; case ast::offset_expr => return true; // slice-mutation-expression case ast::append_expr => return true; case ast::insert_expr => return true; case => return false; }; }; fn is_unary(e: *ast::expr) bool = { if (is_postfix(e)) { return true; }; match (e.expr) { case ast::compound_expr => return true; case ast::match_expr => return true; case ast::switch_expr => return true; case ast::unarithm_expr => return true; case => return false; }; }; fn is_cast(e: *ast::expr) bool = { return is_unary(e) || (e.expr is ast::cast_expr); }; fn assert_expr( ctx: *context, syn: *synfunc, e: *ast::assert_expr, ) (size | io::error) = { let z = 0z; if (e.is_static) { z += syn(ctx, "static", synkind::KEYWORD)?; z += space(ctx)?; }; // assert without a condition = abort match (e.cond) { case let e: *ast::expr => z += syn(ctx, "assert", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e)?; case null => z += syn(ctx, "abort", synkind::KEYWORD)?; z += syn(ctx, "(", synkind::PUNCTUATION)?; }; match (e.message) { case let m: *ast::expr => match (e.cond) { case null 
=> void; case *ast::expr => z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; }; z += _expr(ctx, syn, m)?; case null => void; }; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; }; fn append_insert_expr( ctx: *context, syn: *synfunc, e: *ast::expr, ) (size | io::error) = { let z = 0z; const e: *ast::append_expr = match (e.expr) { case let e: ast::append_expr => if (e.is_static) { z += syn(ctx, "static", synkind::KEYWORD)?; z += space(ctx)?; }; z += syn(ctx, "append", synkind::KEYWORD)?; yield &e; case let e: ast::insert_expr => if (e.is_static) { z += syn(ctx, "static", synkind::KEYWORD)?; z += space(ctx)?; }; z += syn(ctx, "insert", synkind::KEYWORD)?; yield &e; case => abort(); // unreachable }; z += syn(ctx, "(", synkind::PUNCTUATION)?; z += _expr(ctx, syn, e.object)?; z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; z += _expr(ctx, syn, e.value)?; if (e.variadic) { z += syn(ctx, "...", synkind::OPERATOR)?; }; match (e.length) { case null => void; case let l: *ast::expr => z += syn(ctx, ",", synkind::PUNCTUATION)?; z += space(ctx)?; z += _expr(ctx, syn, l)?; }; z += syn(ctx, ")", synkind::PUNCTUATION)?; return z; }; hare-update-0.25.2.0/v0_25_2/unparse/ident.ha000066400000000000000000000017171503370650000203100ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use v0_25_2::ast; use io; use memio; // Unparses an identifier. export fn ident(out: io::handle, id: ast::ident) (size | io::error) = { let ctx = context { out = out, ... }; return _ident(&ctx, &syn_nowrap, id, synkind::IDENT); }; fn _ident( ctx: *context, syn: *synfunc, id: ast::ident, kind: synkind, ) (size | io::error) = { let n = 0z; for (let i = 0z; i < len(id); i += 1) { n += syn(ctx, id[i], kind)?; if (i + 1 < len(id)) { n += syn(ctx, "::", kind)?; }; }; return n; }; // Unparses an identifier into a string. The caller must free the return value. 
export fn identstr(id: ast::ident) str = { let buf = memio::dynamic(); ident(&buf, id)!; return memio::string(&buf)!; }; @test fn ident() void = { let s = identstr(["foo", "bar", "baz"]); defer free(s); assert(s == "foo::bar::baz"); let s = identstr(["foo"]); defer free(s); assert(s == "foo"); }; hare-update-0.25.2.0/v0_25_2/unparse/import.ha000066400000000000000000000041001503370650000205040ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use v0_25_2::ast; use io; use memio; // Unparses a [[hare::ast::import]]. export fn import( out: io::handle, syn: *synfunc, import: *ast::import, ) (size | io::error) = { let n = 0z; let ctx = context { out = out, stack = &stack { cur = import, ... }, ... }; n += syn(&ctx, "use", synkind::KEYWORD)?; n += space(&ctx)?; match (import.bindings) { case void => n += _ident(&ctx, syn, import.ident, synkind::IDENT)?; case let alias: ast::import_alias => n += syn(&ctx, alias, synkind::IMPORT_ALIAS)?; n += space(&ctx)?; n += syn(&ctx, "=", synkind::OPERATOR)?; n += space(&ctx)?; n += _ident(&ctx, syn, import.ident, synkind::IDENT)?; case let objects: ast::import_members => n += _ident(&ctx, syn, import.ident, synkind::IDENT)?; n += syn(&ctx, "::", synkind::IDENT)?; n += syn(&ctx, "{", synkind::PUNCTUATION)?; for (let i = 0z; i < len(objects); i += 1) { n += syn(&ctx, objects[i], synkind::SECONDARY)?; if (i + 1 < len(objects)) { n += syn(&ctx, ",", synkind::PUNCTUATION)?; n += space(&ctx)?; }; }; n += syn(&ctx, "}", synkind::PUNCTUATION)?; case ast::import_wildcard => n += _ident(&ctx, syn, import.ident, synkind::IDENT)?; n += syn(&ctx, "::", synkind::IDENT)?; n += syn(&ctx, "*", synkind::PUNCTUATION)?; }; n += syn(&ctx, ";", synkind::PUNCTUATION)?; return n; }; @test fn import() void = { let tests: [_](ast::import, str) = [ (ast::import { ident = ["foo", "bar", "baz"], bindings = void, ... }, "use foo::bar::baz;"), (ast::import { ident = ["foo"], bindings = "bar", ... 
}, "use bar = foo;"), (ast::import { ident = ["foo"], bindings = ["bar", "baz"], ... }, "use foo::{bar, baz};"), (ast::import { ident = ["foo", "bar"], bindings = ast::import_wildcard, ... }, "use foo::bar::*;"), ]; for (let (ast_import, str_import) .. tests) { let buf = memio::dynamic(); import(&buf, &syn_nowrap, &ast_import)!; let s = memio::string(&buf)!; assert(s == str_import); free(s); }; }; hare-update-0.25.2.0/v0_25_2/unparse/syn.ha000066400000000000000000000123571503370650000200140ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use fmt; use v0_25_2::ast; use io; // A user-supplied function which writes unparsed Hare source code to a handle, // optionally including extra stylistic features. The function is expected to, // at the minimum, write the provided string to ctx.out, and update // ctx.linelen based on how much data was written. // // [[syn_nowrap]] and [[syn_wrap]] are provided for when no additional styling // is desired. export type synfunc = fn( ctx: *context, s: str, kind: synkind, ) (size | io::error); // The kind of thing being unparsed. export type synkind = enum { IDENT, COMMENT, CONSTANT, FUNCTION, GLOBAL, TYPEDEF, IMPORT_ALIAS, SECONDARY, KEYWORD, TYPE, ATTRIBUTE, OPERATOR, PUNCTUATION, RUNE_STRING, NUMBER, LABEL, }; // Context about the unparsing state supplied to a [[synfunc]]. The linelen and // indent fields may be mutated. export type context = struct { out: io::handle, stack: nullable *stack, linelen: size, indent: size, }; // A linked list of AST nodes currently being unparsed. export type stack = struct { cur: (*ast::decl | *ast::expr | *ast::_type | *ast::import), up: nullable *stack, extra: nullable *opaque, }; // A [[synfunc]] implementation which unparses without additional styling, and // without wrapping any long lines. 
export fn syn_nowrap( ctx: *context, s: str, kind: synkind, ) (size | io::error) = { const z = fmt::fprint(ctx.out, s)?; ctx.linelen += z; return z; }; type syn_wrap_extra = enum { NONE, MULTILINE_FN_PARAM, MULTILINE_FN_OTHER, MULTILINE_TAGGED_OR_TUPLE, }; // A [[synfunc]] implementation which unparses without additional styling, but // which wraps some long lines at 80 columns, in accordance with the style // guide. export fn syn_wrap(ctx: *context, s: str, kind: synkind) (size | io::error) = { let extra = :extra { let st = match (ctx.stack) { case let st: *stack => yield st; case null => yield :extra, &syn_wrap_extra::NONE; }; match (st.extra) { case let p: *opaque => yield :extra, p: *syn_wrap_extra; case null => match (st.up) { case let st: *stack => match (st.extra) { case let p: *opaque => const p = p: *syn_wrap_extra; if (*p == syn_wrap_extra::MULTILINE_FN_PARAM) { yield :extra, p; }; case null => void; }; case null => void; }; }; if (s == "(") match (st.cur) { case let t: *ast::_type => match (t.repr) { case ast::func_type => void; case => yield :extra, &syn_wrap_extra::NONE; }; let z = _type(io::empty, &syn_nowrap, t)!; if (ctx.linelen + z < 80) yield; st.extra = alloc(syn_wrap_extra::MULTILINE_FN_PARAM)!; z = fmt::fprintln(ctx.out, s)?; ctx.linelen = 0; ctx.indent += 1; return z; case => yield :extra, &syn_wrap_extra::NONE; }; // use 72 as max linelen instead of 80 to give a bit of leeway. 
// XXX: this probably could be made more accurate if (ctx.linelen < 72 || (s != "," && s != "|")) { yield :extra, &syn_wrap_extra::NONE; }; const t = match (st.cur) { case let t: *ast::_type => yield t; case => yield :extra, &syn_wrap_extra::NONE; }; match (t.repr) { case (ast::tagged_type | ast::tuple_type) => void; case => yield :extra, &syn_wrap_extra::NONE; }; st.extra = alloc(syn_wrap_extra::MULTILINE_TAGGED_OR_TUPLE)!; let z = fmt::fprintln(ctx.out, s)?; ctx.indent += 1; ctx.linelen = ctx.indent * 8; for (let i = 0z; i < ctx.indent; i += 1) { z += fmt::fprint(ctx.out, "\t")?; }; return z; }; let z = 0z; switch (*extra) { case syn_wrap_extra::NONE => void; case syn_wrap_extra::MULTILINE_FN_PARAM => switch (s) { case ")" => match (ctx.stack) { case let st: *stack => free(st.extra); st.extra = null; case null => void; }; ctx.indent -= 1; case "..." => match (ctx.stack) { case let st: *stack => free(st.extra); st.extra = null; case null => void; }; for (let i = 0z; i < ctx.indent; i += 1) { z += fmt::fprint(ctx.out, "\t")?; }; z += fmt::fprintln(ctx.out, s)?; ctx.indent -= 1; ctx.linelen = 0; return z; case => *extra = syn_wrap_extra::MULTILINE_FN_OTHER; ctx.linelen = ctx.indent * 8; for (let i = 0z; i < ctx.indent; i += 1) { z += fmt::fprint(ctx.out, "\t")?; }; }; case syn_wrap_extra::MULTILINE_FN_OTHER => switch (s) { case ")" => match (ctx.stack) { case let st: *stack => free(st.extra); st.extra = null; case null => void; }; ctx.indent -= 1; ctx.linelen = ctx.indent * 8; z += fmt::fprintln(ctx.out, ",")?; for (let i = 0z; i < ctx.indent; i += 1) { z += fmt::fprint(ctx.out, "\t")?; }; case ",", "..." 
=> *extra = syn_wrap_extra::MULTILINE_FN_PARAM; ctx.linelen = 0; return fmt::fprintln(ctx.out, s)?; case => void; }; case syn_wrap_extra::MULTILINE_TAGGED_OR_TUPLE => switch (s) { case ")" => let st = ctx.stack as *stack; free(st.extra); st.extra = null; ctx.indent -= 1; case ",", "|" => if (ctx.linelen < 72) yield; z += fmt::fprintln(ctx.out, s)?; ctx.linelen = ctx.indent * 8; for (let i = 0z; i < ctx.indent; i += 1) { z += fmt::fprint(ctx.out, "\t")?; }; return z; case => void; }; }; z += syn_nowrap(ctx, s, kind)?; return z; }; hare-update-0.25.2.0/v0_25_2/unparse/type.ha000066400000000000000000000203621503370650000201630ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use fmt; use v0_25_2::ast; use v0_25_2::ast::{variadism}; use v0_25_2::lex; use io; use memio; use strings; // Returns a builtin type as a string. export fn builtin_type(b: ast::builtin_type) str = switch (b) { case ast::builtin_type::FCONST, ast::builtin_type::ICONST, ast::builtin_type::RCONST => abort("ICONST, FCONST, and RCONST have no lexical representation"); case ast::builtin_type::BOOL => yield "bool"; case ast::builtin_type::DONE => yield "done"; case ast::builtin_type::F32 => yield "f32"; case ast::builtin_type::F64 => yield "f64"; case ast::builtin_type::I16 => yield "i16"; case ast::builtin_type::I32 => yield "i32"; case ast::builtin_type::I64 => yield "i64"; case ast::builtin_type::I8 => yield "i8"; case ast::builtin_type::INT => yield "int"; case ast::builtin_type::NEVER => yield "never"; case ast::builtin_type::NOMEM => yield "nomem"; case ast::builtin_type::NULL => yield "null"; case ast::builtin_type::OPAQUE => yield "opaque"; case ast::builtin_type::RUNE => yield "rune"; case ast::builtin_type::SIZE => yield "size"; case ast::builtin_type::STR => yield "str"; case ast::builtin_type::U16 => yield "u16"; case ast::builtin_type::U32 => yield "u32"; case ast::builtin_type::U64 => yield "u64"; case ast::builtin_type::U8 => yield "u8"; case 
ast::builtin_type::UINT => yield "uint"; case ast::builtin_type::UINTPTR => yield "uintptr"; case ast::builtin_type::VALIST => yield "valist"; case ast::builtin_type::VOID => yield "void"; }; fn prototype( ctx: *context, syn: *synfunc, t: *ast::func_type, ) (size | io::error) = { let n = 0z; n += syn(ctx, "(", synkind::PUNCTUATION)?; for (let i = 0z; i < len(t.params); i += 1) { const param = &t.params[i]; if (param.name != "") { n += syn(ctx, param.name, synkind::SECONDARY)?; n += syn(ctx, ":", synkind::PUNCTUATION)?; n += space(ctx)?; }; n += __type(ctx, syn, param._type)?; match (param.default_value) { case void => yield; case let e: ast::expr => n += space(ctx)?; n += syn(ctx, "=", synkind::PUNCTUATION)?; n += space(ctx)?; n += _expr(ctx, syn, &e)?; }; if (i + 1 < len(t.params) || t.variadism == variadism::C) { n += syn(ctx, ",", synkind::PUNCTUATION)?; n += space(ctx)?; }; }; if (t.variadism != variadism::NONE) { n += syn(ctx, "...", synkind::OPERATOR)?; }; n += syn(ctx, ")", synkind::PUNCTUATION)?; n += space(ctx)?; n += __type(ctx, syn, t.result)?; return n; }; fn struct_union_type( ctx: *context, syn: *synfunc, t: *ast::_type, ) (size | io::error) = { let z = 0z; let membs = match (t.repr) { case let st: ast::struct_type => z += syn(ctx, "struct", synkind::TYPE)?; z += space(ctx)?; if (st.packed) { z += syn(ctx, "@packed", synkind::ATTRIBUTE)?; z += space(ctx)?; }; z += syn(ctx, "{", synkind::PUNCTUATION)?; yield st.members: []ast::struct_member; case let ut: ast::union_type => z += syn(ctx, "union", synkind::TYPE)?; z += space(ctx)?; z += syn(ctx, "{", synkind::PUNCTUATION)?; yield ut: []ast::struct_member; case => abort(); // unreachable }; ctx.indent += 1z; for (let memb .. 
membs) { z += fmt::fprintln(ctx.out)?; ctx.linelen = 0; if (memb.docs != "") { z += comment(ctx, syn, memb.docs)?; }; for (let i = 0z; i < ctx.indent; i += 1) { z += fmt::fprint(ctx.out, "\t")?; ctx.linelen += 8; }; match (memb._offset) { case null => void; case let ex: *ast::expr => z += syn(ctx, "@offset(", synkind::ATTRIBUTE)?; z += _expr(ctx, syn, ex)?; z += syn(ctx, ")", synkind::ATTRIBUTE)?; z += space(ctx)?; }; match (memb.member) { case let se: ast::struct_embedded => z += __type(ctx, syn, se)?; case let sa: ast::struct_alias => z += _ident(ctx, syn, sa, synkind::IDENT)?; case let sf: ast::struct_field => z += syn(ctx, sf.name, synkind::SECONDARY)?; z += syn(ctx, ":", synkind::PUNCTUATION)?; z += space(ctx)?; z += __type(ctx, syn, sf._type)?; }; z += syn(ctx, ",", synkind::PUNCTUATION)?; }; ctx.indent -= 1; z += newline(ctx)?; z += syn(ctx, "}", synkind::PUNCTUATION)?; return z; }; fn multiline_comment(s: str) bool = strings::byteindex(s, '\n') as size != len(s) - 1; // Unparses a [[hare::ast::_type]]. export fn _type( out: io::handle, syn: *synfunc, t: *ast::_type, ) (size | io::error) = { let ctx = context { out = out, ... }; return __type(&ctx, syn, t); }; fn __type(ctx: *context, syn: *synfunc, t: *ast::_type) (size | io::error) = { ctx.stack = &stack { cur = t, up = ctx.stack, ... 
}; defer { let stack = &(ctx.stack as *stack); match (stack.extra) { case let p: *opaque => free(p); case null => void; }; ctx.stack = stack.up; }; let n = 0z; if (t.flags & ast::type_flag::CONST != 0) { n += syn(ctx, "const", synkind::TYPE)?; n += space(ctx)?; }; if (t.flags & ast::type_flag::ERROR != 0) { n += syn(ctx, "!", synkind::TYPE)?; }; match (t.repr) { case let a: ast::alias_type => if (a.unwrap) { n += syn(ctx, "...", synkind::TYPE)?; }; n += _ident(ctx, syn, a.ident, synkind::TYPE)?; case let b: ast::builtin_type => n += syn(ctx, builtin_type(b), synkind::TYPE)?; case let e: ast::enum_type => n += syn(ctx, "enum", synkind::TYPE)?; n += space(ctx)?; if (e.storage != ast::builtin_type::INT) { n += syn(ctx, builtin_type(e.storage), synkind::TYPE)?; n += space(ctx)?; }; n += syn(ctx, "{", synkind::PUNCTUATION)?; ctx.indent += 1; n += fmt::fprintln(ctx.out)?; ctx.linelen = 0; for (let value .. e.values) { let wrotedocs = false; if (value.docs != "") { // Check if comment should go above or next to // field if (multiline_comment(value.docs)) { n += comment(ctx, syn, value.docs)?; wrotedocs = true; }; }; for (let i = 0z; i < ctx.indent; i += 1) { n += fmt::fprint(ctx.out, "\t")?; ctx.linelen += 8; }; n += syn(ctx, value.name, synkind::SECONDARY)?; match (value.value) { case null => void; case let e: *ast::expr => n += space(ctx)?; n += syn(ctx, "=", synkind::OPERATOR)?; n += space(ctx)?; n += _expr(ctx, syn, e)?; }; n += syn(ctx, ",", synkind::PUNCTUATION)?; if (value.docs != "" && !wrotedocs) { n += space(ctx)?; const oldindent = ctx.indent; ctx.indent = 0; n += comment(ctx, syn, value.docs)?; ctx.indent = oldindent; } else { n += fmt::fprintln(ctx.out)?; ctx.linelen = 0; }; }; ctx.indent -= 1; for (let i = 0z; i < ctx.indent; i += 1) { n += fmt::fprint(ctx.out, "\t")?; ctx.linelen += 8; }; n += syn(ctx, "}", synkind::PUNCTUATION)?; case let f: ast::func_type => n += syn(ctx, "fn", synkind::TYPE)?; n += prototype(ctx, syn, &f)?; case let l: ast::list_type => 
n += syn(ctx, "[", synkind::TYPE)?; match (l.length) { case ast::len_slice => void; case ast::len_unbounded => n += syn(ctx, "*", synkind::TYPE)?; case ast::len_contextual => n += syn(ctx, "_", synkind::TYPE)?; case let e: *ast::expr => n += _expr(ctx, syn, e)?; }; n += syn(ctx, "]", synkind::TYPE)?; n += __type(ctx, syn, l.members)?; case let p: ast::pointer_type => if (p.flags & ast::pointer_flag::NULLABLE != 0) { n += syn(ctx, "nullable", synkind::TYPE)?; n += space(ctx)?; }; n += syn(ctx, "*", synkind::TYPE)?; n += __type(ctx, syn, p.referent)?; case ast::struct_type => n += struct_union_type(ctx, syn, t)?; case ast::union_type => n += struct_union_type(ctx, syn, t)?; case let t: ast::tagged_type => n += syn(ctx, "(", synkind::TYPE)?; for (let i = 0z; i < len(t); i += 1) { n += __type(ctx, syn, t[i])?; if (i + 1 == len(t)) break; n += space(ctx)?; n += syn(ctx, "|", synkind::TYPE)?; n += space(ctx)?; }; n += syn(ctx, ")", synkind::TYPE)?; case let t: ast::tuple_type => n += syn(ctx, "(", synkind::TYPE)?; for (let i = 0z; i < len(t); i += 1) { n += __type(ctx, syn, t[i])?; if (i + 1 == len(t)) break; n += syn(ctx, ",", synkind::TYPE)?; n += space(ctx)?; }; n += syn(ctx, ")", synkind::TYPE)?; }; return n; }; fn type_test(t: *ast::_type, expected: str) void = { let buf = memio::dynamic(); _type(&buf, &syn_nowrap, t)!; let s = memio::string(&buf)!; defer free(s); if (s != expected) { fmt::errorfln("=== wanted\n{}", expected)!; fmt::errorfln("=== got\n{}", s)!; abort(); }; }; hare-update-0.25.2.0/v0_25_2/unparse/unit.ha000066400000000000000000000011271503370650000201570ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use fmt; use v0_25_2::ast; use io; // Unparses a [[hare::ast::subunit]]. export fn subunit( out: io::handle, syn: *synfunc, s: ast::subunit, ) (size | io::error) = { let n = 0z; for (let imp &.. 
s.imports) { n += import(out, syn, imp)?; n += fmt::fprintln(out)?; }; if (len(s.imports) > 0) { n += fmt::fprintln(out)?; }; for (let i = 0z; i < len(s.decls); i += 1) { n += decl(out, syn, &s.decls[i])?; if (i < len(s.decls) - 1) n += fmt::fprintln(out)?; n += fmt::fprintln(out)?; }; return n; }; hare-update-0.25.2.0/v0_25_2/unparse/util.ha000066400000000000000000000007531503370650000201610ustar00rootroot00000000000000// SPDX-License-Identifier: MPL-2.0 // (c) Hare authors use fmt; use io; fn newline(ctx: *context) (size | io::error) = { let n = 0z; n += fmt::fprint(ctx.out, "\n")?; ctx.linelen = 0; for (let i = 0z; i < ctx.indent; i += 1) { n += fmt::fprint(ctx.out, "\t")?; ctx.linelen += 8; }; return n; }; fn space(ctx: *context) (size | io::error) = { if (ctx.linelen <= ctx.indent * 8) { return 0z; }; ctx.linelen += 1; return fmt::fprint(ctx.out, " "); }; hare-update-0.25.2.0/versions/000077500000000000000000000000001503370650000160045ustar00rootroot00000000000000hare-update-0.25.2.0/versions/all.ha000066400000000000000000000006711503370650000170720ustar00rootroot00000000000000use errors; use rules; // Supported versions: use versions::v0_25_2; // List of all supported Hare versions. export const versions = [ &v0_25_2::v0_25_2, ]; // Latest supported Hare version. export const latest = &v0_25_2::v0_25_2; // Gets a specific Hare version. export fn get(name: str) (*rules::version | errors::noentry) = { for (let ver .. versions) { if (ver.name == name) { return ver; }; }; return errors::noentry; }; hare-update-0.25.2.0/versions/v0_25_2/000077500000000000000000000000001503370650000170605ustar00rootroot00000000000000hare-update-0.25.2.0/versions/v0_25_2/.gitignore000066400000000000000000000000131503370650000210420ustar00rootroot00000000000000v0.next.ha hare-update-0.25.2.0/versions/v0_25_2/moved.ha000066400000000000000000000032101503370650000205000ustar00rootroot00000000000000use sort; use v0_25_2::ast; use v0_25_2::parse; // (Sorted!) 
list of all standard library identifiers from time::chrono which were moved to // time::date. const time_chrono_moved_idents: [](str, str) = [ ("chrono::EARTH_DAY", "date::EARTH_DAY"), ("chrono::GPS", "date::GPS"), ("chrono::LOCAL", "date::LOCAL"), ("chrono::MARS_SOL_MARTIAN", "date::MARS_SOL_MARTIAN"), ("chrono::MARS_SOL_TERRESTRIAL", "date::MARS_SOL_TERRESTRIAL"), ("chrono::MTC", "date::MTC"), ("chrono::TAI", "date::TAI"), ("chrono::TT", "date::TT"), ("chrono::UTC", "date::UTC"), ("chrono::coincident", "date::coincident"), ("chrono::daydate", "date::daydate"), ("chrono::daytime", "date::daytime"), ("chrono::fixedzone", "date::fixedzone"), ("chrono::from_datetime", "date::from_datetime"), ("chrono::in", "date::in"), ("chrono::invalidtzif", "date::invalidtzif"), ("chrono::locality", "date::locality"), ("chrono::ozone", "date::zone"), ("chrono::timezone", "date::timezone"), ("chrono::transition", "date::zonetransition"), ("chrono::tz", "date::tzdb"), ("chrono::tzdberror", "date::tzdberror"), ("chrono::zone", "date::zonephase"), ("chrono::zone_finish", "date::zone_finish"), ]; let time_chrono_moved: [](ast::ident, str) = []; @init fn init() void = { for (let id .. time_chrono_moved_idents) { const (id, replacement) = id; const id = parse::identstr(id)!; append(time_chrono_moved, (id, replacement))!; }; assert(sort::sorted(time_chrono_moved, size((ast::ident, str)), &id_replacement_cmp)); }; fn id_replacement_cmp(a: const *opaque, b: const *opaque) int = { const a = a: *(ast::ident, str); const b = b: *(ast::ident, str); return ident_cmp(&a.0, &b.0); }; hare-update-0.25.2.0/versions/v0_25_2/nomem.ha000066400000000000000000000032231503370650000205050ustar00rootroot00000000000000use sort; use v0_25_2::ast; use v0_25_2::parse; // (Sorted!) 
list of all standard library functions that can now return nomem const stdlib_nomem_identstr: []str = [ "ascii::strlower", "ascii::strlower_buf", "ascii::strupper", "ascii::strupper_buf", "base32::decodestr", "base32::encodeslice", "base32::encodestr", "base64::decodestr", "base64::encodeslice", "base64::encodestr", "c::fromstr", "c::fromstr_buf", "debug::trace_store", "dial::registerproto", "dial::registersvc", "dial::resolve", "dns::decode", "dns::parse_domain", "dns::query", "dns::unparse_domain", "fmt::asprint", "fmt::asprintf", "fmt::bsprint", "fmt::bsprintf", "glob::glob", "glob::next", "hex::decodestr", "hex::encodestr", "hosts::host_dup", "hosts::lookup", "hosts::next", "ini::entry_dup", "ini::next", "os::readdir", "passwd::getgid", "passwd::getgroup", "passwd::getgroups", "passwd::getuid", "passwd::getuser", "regex::compile", "regex::find", "regex::findall", "regex::rawreplace", "regex::rawreplacen", "regex::replace", "regex::replacen", "regex::test", "resolvconf::next", "shlex::split", "sort::sort", "strconv::ftos", "strings::concat", "strings::dup", "strings::dupall", "strings::fromrunes", "strings::join", "strings::lpad", "strings::multireplace", "strings::replace", "strings::rpad", "strings::rsplitn", "strings::split", "strings::splitn", "strings::torunes", "wordexp::wordexp" ]; let stdlib_nomem: []ast::ident = []; @init fn init() void = { for (let id .. 
stdlib_nomem_identstr) { const id = parse::identstr(id)!; append(stdlib_nomem, id)!; }; assert(sort::sorted(stdlib_nomem, size(ast::ident), &ident_cmp)); }; hare-update-0.25.2.0/versions/v0_25_2/utils.ha000066400000000000000000000067201503370650000205370ustar00rootroot00000000000000use common; use common::{nonterminal}; use fmt; use rules; use sort; use strings; use v0_25_2; use v0_25_2::ast; use v0_25_2::lex; use v0_25_2::parse; use v0_25_2::unparse; fn match_access(expr: *ast::expr, ident: str) bool = { const access = match (expr.expr) { case let expr: ast::access_expr => yield expr; case => return false; }; const value = match (access) { case let ident: ast::access_identifier => yield ident; case => return false; }; const ident = parse::identstr(ident)!; defer ast::ident_free(ident); return ast::ident_eq(ident, value); }; let result_type: nullable *ast::_type = null; @init fn prototype_hook() void = { parse::register_hook(nonterminal::PROTOTYPE, &on_prototype, null); }; fn on_prototype( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { match (result_type) { case let t: *ast::_type => ast::type_finish(t); case null => void; }; const p = parse::prototype(lex)?; result_type = p.result; }; fn ensure_nomem(ctx: *rules::context, eg: *rules::editgroup) void = { const rt = result_type as *ast::_type; match (&rt.repr) { case let tagged: *ast::tagged_type => let has_nomem = false; for (let subtype .. tagged) { const subtype = match (subtype.repr) { case let b: ast::builtin_type => yield b; case => continue; }; if (subtype == ast::builtin_type::NOMEM) { has_nomem = true; break; }; }; if (has_nomem) { return; }; append(tagged, alloc(ast::_type { repr = ast::builtin_type::NOMEM, ... 
})!)!; rules::edit_insert(eg, rt.end, " | nomem"); case => rules::edit_insert(eg, rt.start, "("); rules::edit_append(eg, rt.end, " | nomem)"); }; }; let imports: []ast::import = []; let imports_sorted = false; @init fn import_hook() void = { parse::register_hook(nonterminal::IMPORTS, &on_imports, null); }; fn on_imports( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { ast::imports_finish(imports); imports = parse::imports(lex)?; imports_sorted = sort::sorted(imports, size(ast::import), &import_cmp); }; fn ensure_import( ctx: *rules::context, eg: *rules::editgroup, ns: ast::ident, ) void = { for (let import .. imports) { if (ast::ident_eq(import.ident, ns)) { return; }; }; const new = unparse::identstr(ns); defer free(new); const import = fmt::asprintf("use {};\n", new)!; defer free(import); let new_import = ast::import { ident = ns, bindings = void, ... }; let insert_before = 0z; if (imports_sorted) { insert_before = sort::lbisect(imports, size(ast::import), &new_import, &import_cmp); }; const loc = imports[insert_before].start; rules::edit_insert(eg, loc, import); new_import.start = loc; new_import.end = loc; // XXX rules::edit_onmerge(eg, &merge_import, alloc(new_import)!); }; fn merge_import(eg: *rules::editgroup, user: nullable *opaque) void = { const import = user: *ast::import; append(imports, *import)!; }; fn import_cmp(a: const *opaque, b: const *opaque) int = { const a = a: const *ast::import, b = b: const *ast::import; return ident_cmp(&a.ident, &b.ident); }; fn ident_cmp(a: const *opaque, b: const *opaque) int = { const a = a: const *ast::ident, b = b: const *ast::ident; for (let i = 0z; i < len(a) && i < len(b); i += 1) { let cmp = strings::compare(a[i], b[i]); if (cmp != 0) { return cmp; }; }; if (len(a) < len(b)) { return -1; } else if (len(a) == len(b)) { return 0; } else { return 1; }; }; 
hare-update-0.25.2.0/versions/v0_25_2/v0.25.2.ha000066400000000000000000000433611503370650000203130ustar00rootroot00000000000000// Code generated by hare-update-genrules // Do not edit by hand! use common::{ltok, nonterminal}; use glue; use io; use rules; use rules::{getvar}; use fmt; use sort; use v0_25_2; use v0_25_2::ast; use v0_25_2::lex; use v0_25_2::parse; let rule_1 = rules::rule { serial = 1, name = "errors::nomem has been removed", hooks = [nonterminal::IDENTIFIER], exec = &rule_1_exec, remember = -1, }; fn rule_1_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_1; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "errors::nomem", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const edit = { let __eg = &rules::editgroup { rule = &rule_1, ... }; rules::edit_replace(__eg, __captures.start, __captures.end, "nomem"); yield __eg; }; if (rules::present(ctx, edit, "Replace with nomem built-in")?) { rules::merge_edits(ctx, edit); }; }; let rule_2 = rules::rule { serial = 2, name = "Allocations may return nomem errors", hooks = [nonterminal::ALLOC_EXPRESSION, nonterminal::APPEND_EXPRESSION, nonterminal::INSERT_EXPRESSION], exec = &rule_2_exec, remember = -1, }; fn rule_2_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_2; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "alloc${:\"balanced\"}${l:\"location\"}", "append${:\"balanced\"}${l:\"location\"}", "insert${:\"balanced\"}${l:\"location\"}", )?) 
{ return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const tok = lex::lex(lex)?; switch (tok.0) { case ltok::LNOT, ltok::QUESTION => return; case => void; }; const assertion = { let __eg = &rules::editgroup { rule = &rule_2, ... }; rules::edit_insert(__eg, getvar(&__captures, "l").start, "!"); yield __eg; }; const propagate = { let __eg = &rules::editgroup { rule = &rule_2, ... }; ensure_nomem(ctx, __eg); rules::edit_insert(__eg, getvar(&__captures, "l").start, "?"); yield __eg; }; match (rules::choose(ctx, &rule_2, __location, ("Add an error assertion when out of memory", assertion), ("Propagate the nomem error to the caller", propagate), )?) { case void => return; case let edit: *rules::editgroup => rules::merge_edits(ctx, edit); }; rules::warning(ctx, rule, "You should review your code for possible memory leaks when returning nomem\nand leaking objects allocated prior to the memory allocation failure."); }; let rule_3 = rules::rule { serial = 3, name = "Allocations may return nomem errors", hooks = [nonterminal::CALL_EXPRESSION], exec = &rule_3_exec, remember = -1, }; fn rule_3_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_3; const lvalue = data: *ast::expr; const lvalue = match (lvalue.expr) { case let expr: ast::access_expr => yield expr; case => return; }; const lvalue = match (lvalue) { case let ident: ast::access_identifier => yield ident; case => return; }; if (sort::search( stdlib_nomem, size(ast::ident), &lvalue, &ident_cmp) is void) { return; }; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "${:\"balanced\"}${l:\"location\"}", )?) 
{ return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const tok = lex::lex(lex)?; switch (tok.0) { case ltok::LNOT, ltok::QUESTION => return; case => void; }; const assertion = { let __eg = &rules::editgroup { rule = &rule_3, ... }; rules::edit_insert(__eg, getvar(&__captures, "l").start, "!"); yield __eg; }; const propagate = { let __eg = &rules::editgroup { rule = &rule_3, ... }; ensure_nomem(ctx, __eg); rules::edit_insert(__eg, getvar(&__captures, "l").start, "?"); yield __eg; }; match (rules::choose(ctx, &rule_3, __location, ("Add an error assertion when out of memory", assertion), ("Propagate the nomem error to the caller", propagate), )?) { case void => return; case let edit: *rules::editgroup => rules::merge_edits(ctx, edit); }; }; let rule_4 = rules::rule { serial = 4, name = "time::unix has been deprecated", hooks = [nonterminal::CALL_EXPRESSION], exec = &rule_4_exec, remember = -1, }; fn rule_4_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_4; const lvalue = data: *ast::expr; if (!match_access(lvalue, "time::unix")) { return; }; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "(${obj:\"expression\"})", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const edit = { let __eg = &rules::editgroup { rule = &rule_4, ... }; rules::edit_replace(__eg, lvalue.start, __captures.end, getvar(&__captures, "obj").text); rules::edit_append(__eg, getvar(&__captures, "obj").end, ".sec"); yield __eg; }; if (rules::present(ctx, edit, "Replace with time::instant.sec")?) 
{ rules::merge_edits(ctx, edit); }; }; let rule_5 = rules::rule { serial = 5, name = "time::from_unix has been deprecated", hooks = [nonterminal::IDENTIFIER], exec = &rule_5_exec, remember = -1, }; fn rule_5_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_5; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "time::from_unix", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const edit = { let __eg = &rules::editgroup { rule = &rule_5, ... }; rules::edit_replace(__eg, __captures.start, __captures.end, "time::new"); yield __eg; }; if (rules::present(ctx, edit, "Replace with time::new")?) { rules::merge_edits(ctx, edit); }; }; let rule_6 = rules::rule { serial = 6, name = "time::date::simultaneous has been deprecated", hooks = [nonterminal::CALL_EXPRESSION], exec = &rule_6_exec, remember = -1, }; fn rule_6_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_6; const lvalue = data: *ast::expr; if (!match_access(lvalue, "date::simultaneous")) { return; }; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "(${:\"expression\"}, ${:\"expression\"})", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const edit = { let __eg = &rules::editgroup { rule = &rule_6, ... }; ensure_import(ctx, __eg, ["time", "chrono"]); rules::edit_replace(__eg, lvalue.start, lvalue.end, "chrono::compare"); rules::edit_insert(__eg, __captures.end, " == 0"); yield __eg; }; if (rules::present(ctx, edit, "Replace with time::chrono::compare")?) 
{ rules::merge_edits(ctx, edit); }; }; let rule_7 = rules::rule { serial = 7, name = "The return value of time::chrono::fixedzone must be freed after use", hooks = [nonterminal::CALL_EXPRESSION], exec = &rule_7_exec, remember = -1, }; fn rule_7_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_7; const lvalue = data: *ast::expr; if (!match_access(lvalue, "chrono::fixedzone")) { return; }; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "${:\"balanced\"}${l:\"location\"}", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const newline = rules::line_end(ctx, getvar(&__captures, "l").start); const todo = { let __eg = &rules::editgroup { rule = &rule_7, ... }; rules::edit_insert(__eg, newline, " // TODO: Free with chrono::timezone_free"); yield __eg; }; if (rules::present(ctx, todo, "Add TODO comment and address later")?) { rules::merge_edits(ctx, todo); }; }; let rule_8 = rules::rule { serial = 8, name = "time::chrono::invalid has been replaced with utciniterr", hooks = [nonterminal::IDENTIFIER], exec = &rule_8_exec, remember = -1, }; fn rule_8_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_8; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "chrono::invalid", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const replace = { let __eg = &rules::editgroup { rule = &rule_8, ... }; rules::edit_replace(__eg, __captures.start, __captures.end, "chrono::utciniterr"); yield __eg; }; if (rules::present(ctx, replace, "Replace with chrono::utciniterr")?) 
{ rules::merge_edits(ctx, replace); }; rules::warning(ctx, rule, "time::chrono::utciniterr provides additional error information your application may wish to examine."); }; let rule_9 = rules::rule { serial = 9, name = "time::date::peq has been renamed", hooks = [nonterminal::IDENTIFIER], exec = &rule_9_exec, remember = -1, }; fn rule_9_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_9; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "date::peq", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const replace = { let __eg = &rules::editgroup { rule = &rule_9, ... }; rules::edit_replace(__eg, __captures.start, __captures.end, "date::equalspan"); yield __eg; }; if (rules::present(ctx, replace, "Replace with date::equalspan")?) { rules::merge_edits(ctx, replace); }; }; let rule_10 = rules::rule { serial = 10, name = "time::date::unit has been renamed", hooks = [nonterminal::IDENTIFIER], exec = &rule_10_exec, remember = -1, }; fn rule_10_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_10; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "date::unit", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const replace = { let __eg = &rules::editgroup { rule = &rule_10, ... }; rules::edit_replace(__eg, __captures.start, __captures.end, "date::step"); yield __eg; }; if (rules::present(ctx, replace, "Replace with date::step")?) 
{ rules::merge_edits(ctx, replace); }; }; let rule_11 = rules::rule { serial = 11, name = "time::date::unitdiff has been renamed", hooks = [nonterminal::IDENTIFIER], exec = &rule_11_exec, remember = -1, }; fn rule_11_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_11; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "date::unitdiff", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const replace = { let __eg = &rules::editgroup { rule = &rule_11, ... }; rules::edit_replace(__eg, __captures.start, __captures.end, "date::hop"); yield __eg; }; if (rules::present(ctx, replace, "Replace with date::hop")?) { rules::merge_edits(ctx, replace); }; }; let rule_12 = rules::rule { serial = 12, name = "time::date::period has been renamed", hooks = [nonterminal::IDENTIFIER], exec = &rule_12_exec, remember = -1, }; fn rule_12_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_12; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "date::period", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const replace = { let __eg = &rules::editgroup { rule = &rule_12, ... }; rules::edit_replace(__eg, __captures.start, __captures.end, "date::span"); yield __eg; }; if (rules::present(ctx, replace, "Replace with date::span")?) 
{ rules::merge_edits(ctx, replace); }; }; let rule_13 = rules::rule { serial = 13, name = "time::date::pdiff has been renamed", hooks = [nonterminal::IDENTIFIER], exec = &rule_13_exec, remember = -1, }; fn rule_13_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_13; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "date::pdiff", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const replace = { let __eg = &rules::editgroup { rule = &rule_13, ... }; rules::edit_replace(__eg, __captures.start, __captures.end, "date::traverse"); yield __eg; }; if (rules::present(ctx, replace, "Replace with date::traverse")?) { rules::merge_edits(ctx, replace); }; }; let rule_14 = rules::rule { serial = 14, name = "Some members of time::chrono have been moved to time::date", hooks = [nonterminal::IDENTIFIER], exec = &rule_14_exec, remember = -1, }; fn rule_14_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_14; const start = lex::mkloc(lex); const id = parse::ident(lex)?; const end = lex::mkloc(lex); const key = (id, ""); const replacement = match (sort::search( time_chrono_moved, size((ast::ident, str)), &id, &ident_cmp)) { case let i: size => yield time_chrono_moved[i].1; case void => return; }; const replace = { let __eg = &rules::editgroup { rule = &rule_14, ... }; ensure_import(ctx, __eg, ["time", "date"]); rules::edit_replace(__eg, start, end, replacement); yield __eg; }; if (rules::present(ctx, replace, "Rename this symbol")?) 
{ rules::merge_edits(ctx, replace); }; }; let rule_15 = rules::rule { serial = 15, name = "time::chrono::convert may return multiple values if the result is ambiguous", hooks = [nonterminal::CALL_EXPRESSION], exec = &rule_15_exec, remember = -1, }; fn rule_15_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_15; const lvalue = data: *ast::expr; if (!match_access(lvalue, "chrono::convert")) { return; }; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "${expr:\"balanced\"}${l:\"location\"}", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const tok = lex::lex(lex)?; switch (tok.0) { case ltok::LNOT, ltok::QUESTION => yield; case => return; }; const end = lex::mkloc(lex); const edit = { let __eg = &rules::editgroup { rule = &rule_15, ... }; rules::edit_replace(__eg, getvar(&__captures, "l").start, end, "[0]"); yield __eg; }; if (rules::present(ctx, edit, "Use the first value returned")?) { rules::merge_edits(ctx, edit); }; }; let rule_16 = rules::rule { serial = 16, name = "time::chrono::analytical has been removed", hooks = [nonterminal::IDENTIFIER], exec = &rule_16_exec, remember = -1, }; fn rule_16_exec( lex: *lex::lexer, data: nullable *opaque, user: nullable *opaque, ) (void | parse::error) = { const ctx = rules::getcontext(user); const __location = lex::mkloc(lex); const rule = &rule_16; let __captures = rules::captures { ... }; if (!rules::match_pattern(ctx, &__captures, "chrono::analytical", )?) { return; }; defer rules::captures_finish(v0_25_2.glue, &__captures); const newline = rules::line_end(ctx, __captures.start); const todo = { let __eg = &rules::editgroup { rule = &rule_16, ... 
}; rules::edit_insert(__eg, newline, " // TODO: chrono::analytical has been removed"); yield __eg; }; if (rules::present(ctx, todo, "Add TODO comment and address later")?) { rules::merge_edits(ctx, todo); }; }; export const v0_25_2 = rules::version { name = "v0.25.2", glue = &v0_25_2::glue, down = "v0.24.2", rules = [ &rule_1, &rule_2, &rule_3, &rule_4, &rule_5, &rule_6, &rule_7, &rule_8, &rule_9, &rule_10, &rule_11, &rule_12, &rule_13, &rule_14, &rule_15, &rule_16, ], }; hare-update-0.25.2.0/versions/v0_25_2/v0.25.2.ha.in000066400000000000000000000147751503370650000207270ustar00rootroot00000000000000use fmt; use sort; use v0_25_2; use v0_25_2::ast; use v0_25_2::lex; use v0_25_2::parse; @version@("v0.25.2") { glue = &v0_25_2::glue, down = "v0.24.2", }; @rule@("errors::nomem has been removed") :: "identifier" { @match@ { errors::nomem }; const edit = @edit@ { @replace@($.start, $.end, "nomem"); }; @present@(edit, "Replace with nomem built-in"); }; @rule@("Allocations may return nomem errors") :: "allocation-expression", "append-expression", "insert-expression" { @match@ { alloc${:"balanced"}${l:"location"} }, { append${:"balanced"}${l:"location"} }, { insert${:"balanced"}${l:"location"} }; const tok = lex::lex(lex)?; switch (tok.0) { case ltok::LNOT, ltok::QUESTION => return; // Already fixed by user case => void; }; const assertion = @edit@ { @insert@($l.start, "!"); }; const propagate = @edit@ { ensure_nomem(ctx, __eg); @insert@($l.start, "?"); }; @choice@ { case "Add an error assertion when out of memory" => assertion, case "Propagate the nomem error to the caller" => propagate, }; rules::warning(ctx, rule, `You should review your code for possible memory leaks when returning nomem and leaking objects allocated prior to the memory allocation failure.`); }; @rule@("Allocations may return nomem errors") :: "call-expression" { const lvalue = data: *ast::expr; const lvalue = match (lvalue.expr) { case let expr: ast::access_expr => yield expr; case => return; }; const 
lvalue = match (lvalue) { case let ident: ast::access_identifier => yield ident; case => return; }; if (sort::search( stdlib_nomem, size(ast::ident), &lvalue, &ident_cmp) is void) { return; }; @match@ { ${:"balanced"}${l:"location"} }; const tok = lex::lex(lex)?; switch (tok.0) { case ltok::LNOT, ltok::QUESTION => return; // Already fixed by user case => void; }; const assertion = @edit@ { @insert@($l.start, "!"); }; const propagate = @edit@ { ensure_nomem(ctx, __eg); @insert@($l.start, "?"); }; @choice@ { case "Add an error assertion when out of memory" => assertion, case "Propagate the nomem error to the caller" => propagate, }; }; @rule@("time::unix has been deprecated") :: "call-expression" { const lvalue = data: *ast::expr; if (!match_access(lvalue, "time::unix")) { return; }; @match@ { (${obj:"expression"}) }; const edit = @edit@ { @replace@(lvalue.start, $.end, $obj.text); @append@($obj.end, ".sec"); }; @present@(edit, "Replace with time::instant.sec"); }; @rule@("time::from_unix has been deprecated") :: "identifier" { @match@ { time::from_unix }; const edit = @edit@ { @replace@($.start, $.end, "time::new"); }; @present@(edit, "Replace with time::new"); }; @rule@("time::date::simultaneous has been deprecated") :: "call-expression" { const lvalue = data: *ast::expr; if (!match_access(lvalue, "date::simultaneous")) { return; }; @match@ { (${:"expression"}, ${:"expression"}) }; const edit = @edit@ { ensure_import(ctx, __eg, ["time", "chrono"]); @replace@(lvalue.start, lvalue.end, "chrono::compare"); @insert@($.end, " == 0"); }; @present@(edit, "Replace with time::chrono::compare"); }; @rule@("The return value of time::chrono::fixedzone must be freed after use") :: "call-expression" { const lvalue = data: *ast::expr; if (!match_access(lvalue, "chrono::fixedzone")) { return; }; @match@ { ${:"balanced"}${l:"location"} }; const newline = rules::line_end(ctx, $l.start); const todo = @edit@ { @insert@(newline, " // TODO: Free with chrono::timezone_free"); }; 
@present@(todo, "Add TODO comment and address later"); }; @rule@("time::chrono::invalid has been replaced with utciniterr") :: "identifier" { @match@ { chrono::invalid }; const replace = @edit@ { @replace@($.start, $.end, "chrono::utciniterr"); }; @present@(replace, "Replace with chrono::utciniterr"); rules::warning(ctx, rule, "time::chrono::utciniterr provides additional error information your application may wish to examine."); }; @rule@("time::date::peq has been renamed") :: "identifier" { @match@ { date::peq }; const replace = @edit@ { @replace@($.start, $.end, "date::equalspan"); }; @present@(replace, "Replace with date::equalspan"); }; @rule@("time::date::unit has been renamed") :: "identifier" { @match@ { date::unit }; const replace = @edit@ { @replace@($.start, $.end, "date::step"); }; @present@(replace, "Replace with date::step"); }; @rule@("time::date::unitdiff has been renamed") :: "identifier" { @match@ { date::unitdiff }; const replace = @edit@ { @replace@($.start, $.end, "date::hop"); }; @present@(replace, "Replace with date::hop"); }; @rule@("time::date::period has been renamed") :: "identifier" { @match@ { date::period }; const replace = @edit@ { @replace@($.start, $.end, "date::span"); }; @present@(replace, "Replace with date::span"); }; @rule@("time::date::pdiff has been renamed") :: "identifier" { @match@ { date::pdiff }; const replace = @edit@ { @replace@($.start, $.end, "date::traverse"); }; @present@(replace, "Replace with date::traverse"); }; @rule@("Some members of time::chrono have been moved to time::date") :: "identifier" { const start = lex::mkloc(lex); const id = parse::ident(lex)?; const end = lex::mkloc(lex); const key = (id, ""); const replacement = match (sort::search( time_chrono_moved, size((ast::ident, str)), &id, &ident_cmp)) { case let i: size => yield time_chrono_moved[i].1; case void => return; }; const replace = @edit@ { ensure_import(ctx, __eg, ["time", "date"]); @replace@(start, end, replacement); }; @present@(replace, "Rename 
this symbol"); }; @rule@("time::chrono::convert may return multiple values if the result is ambiguous") :: "call-expression" { const lvalue = data: *ast::expr; if (!match_access(lvalue, "chrono::convert")) { return; }; @match@ { ${expr:"balanced"}${l:"location"} }; const tok = lex::lex(lex)?; switch (tok.0) { case ltok::LNOT, ltok::QUESTION => yield; case => // Already fixed by the user, or their code is too complex for // us to meaningfully propose an edit to (e.g. matching on and // handling the error). Let the build fail in the latter case. return; }; const end = lex::mkloc(lex); const edit = @edit@ { @replace@($l.start, end, "[0]"); }; @present@(edit, "Use the first value returned"); }; @rule@("time::chrono::analytical has been removed") :: "identifier" { @match@ { chrono::analytical }; const newline = rules::line_end(ctx, $l.start); const todo = @edit@ { @insert@(newline, " // TODO: chrono::analytical has been removed"); }; @present@(todo, "Add TODO comment and address later"); };