// Copyright 2012 The Rust Project Developers. See the COPYRIGHT
// file at the top-level directory of this distribution and at
// http://rust-lang.org/COPYRIGHT.
//
// Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
// http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
// <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
// option. This file may not be copied, modified, or distributed
// except according to those terms.
/*!
 *
 * # Compilation of match statements
 *
 * I will endeavor to explain the code as best I can. I have only a loose
 * understanding of some parts of it.
 *
 * The basic state of the code is maintained in an array `m` of `@Match`
 * objects. Each `@Match` describes some list of patterns, all of which must
 * match against the current list of values. If those patterns match, then
 * the arm listed in the match is the correct arm. A given arm may have
 * multiple corresponding match entries, one for each alternative that
 * remains. As we proceed, these sets of matches are adjusted by the various
 * `enter_XXX()` functions, each of which adjusts the set of options given
 * some information about the value which has been matched.
 *
 * So, initially, there is one value and N matches, each of which has one
 * constituent pattern. N here is usually the number of arms but may be
 * greater, if some arms have multiple alternatives. For example, here:
 *
 *     enum Foo { A, B(int), C(uint, uint) }
 *
 * The value would be `foo`. There would be four matches, each of which
 * contains one pattern (and, in one case, a guard). We could collect the
 * various options and then compile the code for the case where `foo` is an
 * `A`, a `B`, and a `C`. When we generate the code for `C`, we would (1)
 * drop the two matches that do not match a `C` and (2) expand the other two
 * into two patterns each. In the first case, the two patterns would be `1u`
 * and `2`, and in the second case the `_` pattern would be expanded into
 * `_` and `_`. The two values are of course the arguments to `C`.
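The match expression itself is elided above; a plausible version of the example, written in modern Rust syntax (the arm bodies and the exact `C` patterns are guesses reconstructed from the surrounding description), would be:

```rust
enum Foo {
    A,
    B(i64),
    C(u64, u64),
}

// Four arms, one pattern each. When compilation "enters" `C`, the first two
// matches are dropped and the two `C` arms are each expanded into their two
// sub-patterns: (`1`, `2`) for the third arm and (`_`, `_`) for the last.
fn describe(foo: Foo) -> &'static str {
    match foo {
        Foo::A => "A",
        Foo::B(_) => "B",
        Foo::C(1, 2) => "C(1, 2)",
        Foo::C(_, _) => "C(_, _)",
    }
}
```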
 *
 * Here is a quick guide to the various functions:
 *
 * - `compile_submatch()`: The main workhorse. It takes a list of values and
 *   a list of matches and finds the various possibilities that could occur.
 *
 * - `enter_XXX()`: modifies the list of matches based on some information
 *   about the value that has been matched. For example,
 *   `enter_rec_or_struct()` adjusts the values given that a record or struct
 *   has been matched. This is an infallible pattern, so *all* of the matches
 *   must be either wildcards or record/struct patterns. `enter_opt()`
 *   handles the fallible cases, and it is correspondingly more complex.
 *
 * # Bindings
 *
 * We store information about the bound variables for each arm as part of the
 * per-arm `ArmData` struct. There is a mapping from identifiers to
 * `BindingInfo` structs. These structs contain the mode/id/type of the
 * binding, but they also contain up to two LLVM values, called `llmatch` and
 * `llbinding` respectively (the `llbinding`, as will be described shortly, is
 * optional and only present for by-value bindings---therefore it is bundled
 * up as part of the `TransBindingMode` type). Both point at allocas.
 *
 * The `llmatch` binding always stores a pointer into the value being matched
 * which points at the data for the binding. If the value being matched has
 * type `T`, then `llmatch` will point at an alloca of type `T*` (and hence
 * `llmatch` has type `T**`). So, if you have a pattern like:
 *
 *     match (a, b) { (ref c, copy d) => { ... } }
 *
 * For `c` and `d`, we would generate allocas of type `C*` and `D*`
 * respectively. These are called the `llmatch`. As we match, when we come
 * up against an identifier, we store the current pointer into the
 * corresponding alloca.
 *
 * In addition, for each by-value binding (copy or move), we will create a
 * second alloca (`llbinding`) that will hold the final value. In this
 * example, that means that `d` would have this second alloca of type `D` (and
 * hence `llbinding` has type `D*`).
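The two levels of indirection can be sketched in modern Rust (the names `llmatch_d` and `llbinding_d` are borrowed from the text; the function is purely illustrative, not the actual codegen):

```rust
fn bind_by_value(matched: &(char, i64)) -> i64 {
    // llmatch_d: a slot holding a pointer into the matched value. For a
    // binding of type D, the slot conceptually has type D** (a stack slot
    // that contains a D*).
    let llmatch_d: *const i64 = &matched.1;
    // llbinding_d: the by-value binding's own slot, of type D. The arm
    // prologue performs, in effect, *llbinding_d = **llmatch_d.
    let llbinding_d: i64 = unsafe { *llmatch_d };
    llbinding_d
}
```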
 *
 * Once a pattern is completely matched, and assuming that there is no guard
 * pattern, we will branch to a block that leads to the body itself. For any
 * by-value bindings, this block will first load the ptr from `llmatch` (the
 * one of type `D*`) and copy/move the value into `llbinding` (the one of type
 * `D`). The second alloca then becomes the value of the local variable. For
 * by-ref bindings, the value of the local variable is simply the first
 * alloca.
 *
 * So, for the example above, we would generate a setup kind of like this:
 *
 *        +--------------------------------------------+
 *        | llmatch_c = (addr of first half of tuple)  |
 *        | llmatch_d = (addr of second half of tuple) |
 *        +--------------------------------------------+
 *
 *        +--------------------------------------+
 *        | *llbinding_d = **llmatch_d           |
 *        +--------------------------------------+
 *
 * If there is a guard, the situation is slightly different, because we must
 * execute the guard code. Moreover, we need to do so once for each of the
 * alternatives that lead to the arm, because if the guard fails, they may
 * have different points from which to continue the search. Therefore, in that
 * case, we generate code that looks more like:
 *
 *        +--------------------------------------------+
 *        | llmatch_c = (addr of first half of tuple)  |
 *        | llmatch_d = (addr of second half of tuple) |
 *        +--------------------------------------------+
 *
 *        +-------------------------------------------------+
 *        | *llbinding_d = **llmatch_d                      |
 *        | check condition                                 |
 *        | if false { free *llbinding_d, goto next case }  |
 *        | if true { goto body }                           |
 *        +-------------------------------------------------+
 *
 * The handling for the cleanups is a bit... sensitive. Basically, the body
 * is the one that invokes `add_clean()` for each binding. During the guard
 * evaluation, we add temporary cleanups and revoke them after the guard is
 * evaluated (it could fail, after all). Presuming the guard fails, we drop
 * the various values we copied explicitly. Note that guards and moves are
 * just plain incompatible.
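The fallthrough behavior that the generated guard code implements can be seen in source-level terms (modern Rust; an illustrative example, not this module's own output):

```rust
// If the guard fails, control continues with the *next* candidate, which may
// bind the same names differently; each alternative reaching a guarded arm
// therefore needs its own continuation point.
fn classify(x: (i64, i64)) -> &'static str {
    match x {
        (c, _d) if c > 0 => "guard passed",
        (0, _) => "second arm",
        _ => "fell through",
    }
}
```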
 *
 * Some relevant helper functions that manage bindings:
 * - `create_bindings_map()`
 * - `store_non_ref_bindings()`
 * - `insert_lllocals()`
 *
 */
use lib::llvm::{llvm, ValueRef, BasicBlockRef};
use middle::const_eval;
use middle::borrowck::root_map_key;
use middle::pat_util::*;
use middle::resolve::DefMap;
use middle::trans::adt;
use middle::trans::base::*;
use middle::trans::build::*;
use middle::trans::callee;
use middle::trans::common::*;
use middle::trans::consts;
use middle::trans::controlflow;
use middle::trans::datum;
use middle::trans::datum::*;
use middle::trans::expr::Dest;
use middle::trans::expr;
use middle::trans::glue;
use middle::trans::tvec;
use middle::trans::type_of;
use middle::ty;
use util::common::indenter;
use std::hashmap::HashMap;
use std::vec;
use syntax::ast;
use syntax::ast::ident;
use syntax::ast_util::path_to_ident;
use syntax::ast_util;
use syntax::codemap::{span, dummy_sp};
use syntax::print::pprust::pat_to_str;
// An option identifying a literal: either a unit-like struct or an
// expression.
pub enum Lit {
    UnitLikeStructLit(ast::node_id),    // the node ID of the pattern
    ExprLit(@ast::expr),
    ConstLit(ast::def_id),              // the def ID of the constant
}

// An option identifying a branch (either a literal, an enum variant or a
// range).
pub enum Opt {
    lit(Lit),
    var(/* disr val */int, @adt::Repr),
    range(@ast::expr, @ast::expr),
    vec_len_eq(uint),
    vec_len_ge(uint, /* slice */uint)
}
pub fn opt_eq(tcx: ty::ctxt, a: &Opt, b: &Opt) -> bool {
        (&lit(a), &lit(b)) => {
            (UnitLikeStructLit(a), UnitLikeStructLit(b)) => a == b,
                    ExprLit(existing_a_expr) => a_expr = existing_a_expr,
                    ConstLit(a_const) => {
                        let e = const_eval::lookup_const_by_id(tcx, a_const);
                    UnitLikeStructLit(_) => {
                        fail!("UnitLikeStructLit should have been handled \
                    ExprLit(existing_b_expr) => b_expr = existing_b_expr,
                    ConstLit(b_const) => {
                        let e = const_eval::lookup_const_by_id(tcx, b_const);
                    UnitLikeStructLit(_) => {
                        fail!("UnitLikeStructLit should have been handled \
                match const_eval::compare_lit_exprs(tcx, a_expr, b_expr) {
                    Some(val1) => val1 == 0,
                    None => fail!("compare_list_exprs: type mismatch"),
        (&range(a1, a2), &range(b1, b2)) => {
            let m1 = const_eval::compare_lit_exprs(tcx, a1, b1);
            let m2 = const_eval::compare_lit_exprs(tcx, a2, b2);
                (Some(val1), Some(val2)) => (val1 == 0 && val2 == 0),
                _ => fail!("compare_list_exprs: type mismatch"),
        (&var(a, _), &var(b, _)) => a == b,
        (&vec_len_eq(a), &vec_len_eq(b)) => a == b,
        (&vec_len_ge(a, _), &vec_len_ge(b, _)) => a == b,
pub enum opt_result {
    single_result(Result),
    lower_bound(Result),
    range_result(Result, Result),
}
pub fn trans_opt(bcx: block, o: &Opt) -> opt_result {
    let _icx = push_ctxt("match::trans_opt");
    let ccx = bcx.ccx();
        lit(ExprLit(lit_expr)) => {
            let datumblock = expr::trans_to_datum(bcx, lit_expr);
            return single_result(datumblock.to_result());
        lit(UnitLikeStructLit(pat_id)) => {
            let struct_ty = ty::node_id_to_type(bcx.tcx(), pat_id);
            let datumblock = datum::scratch_datum(bcx, struct_ty, true);
            return single_result(datumblock.to_result(bcx));
        lit(ConstLit(lit_id)) => {
            let llval = consts::get_const_val(bcx.ccx(), lit_id);
            return single_result(rslt(bcx, llval));
        var(disr_val, repr) => {
            return adt::trans_case(bcx, repr, disr_val);
        range(l1, l2) => {
            return range_result(rslt(bcx, consts::const_expr(ccx, l1)),
                                rslt(bcx, consts::const_expr(ccx, l2)));
        vec_len_eq(n) => {
            return single_result(rslt(bcx, C_int(ccx, n as int)));
        vec_len_ge(n, _) => {
            return lower_bound(rslt(bcx, C_int(ccx, n as int)));
pub fn variant_opt(bcx: block, pat_id: ast::node_id)
    match ccx.tcx.def_map.get_copy(&pat_id) {
        ast::def_variant(enum_id, var_id) => {
            let variants = ty::enum_variants(ccx.tcx, enum_id);
            for (*variants).iter().advance |v| {
                    return var(v.disr_val,
                               adt::represent_node(bcx, pat_id))
            ::std::util::unreachable();
        ast::def_struct(_) => {
            return lit(UnitLikeStructLit(pat_id));
            ccx.sess.bug("non-variant or struct in variant_opt()");
pub enum TransBindingMode {
    TrByValue(/*ismove:*/ bool, /*llbinding:*/ ValueRef),
    TrByRef,
}

/**
 * Information about a pattern binding:
 * - `llmatch` is a pointer to a stack slot. The stack slot contains a
 *   pointer into the value being matched. Hence, llmatch has type `T**`
 *   where `T` is the value being matched.
 * - `trmode` is the trans binding mode
 * - `id` is the node id of the binding
 * - `ty` is the Rust type of the binding */
pub struct BindingInfo {
    llmatch: ValueRef,
    trmode: TransBindingMode,
    id: ast::node_id,
    ty: ty::t,
}

pub type BindingsMap = HashMap<ident, BindingInfo>;
pub struct ArmData<'self> {
    bodycx: block,
    arm: &'self ast::arm,
    bindings_map: BindingsMap
}

pub struct Match<'self> {
    pats: ~[@ast::pat],
    data: @ArmData<'self>
}
pub fn match_to_str(bcx: block, m: &Match) -> ~str {
    if bcx.sess().verbose() {
        // for many programs, this just takes too long to serialize
        fmt!("%?", m.pats.map(|p| pat_to_str(*p, bcx.sess().intr())))
    } else {
        fmt!("%u pats", m.pats.len())
    }
}

pub fn matches_to_str(bcx: block, m: &[@Match]) -> ~str {
    fmt!("%?", m.map(|n| match_to_str(bcx, *n)))
}
pub fn has_nested_bindings(m: &[@Match], col: uint) -> bool {
    for m.iter().advance |br| {
        match br.pats[col].node {
            ast::pat_ident(_, _, Some(_)) => return true,
            _ => ()
        }
    }
    return false;
}
pub fn expand_nested_bindings<'r>(bcx: block,
    debug!("expand_nested_bindings(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

        match br.pats[col].node {
            ast::pat_ident(_, path, Some(inner)) => {
                let pats = vec::append(
                    br.pats.slice(0u, col).to_owned(),
                    vec::append(~[inner],
                                br.pats.slice(col + 1u,
                    br.data.bindings_map.get(&path_to_ident(path));
                Store(bcx, val, binding_info.llmatch);
                @Match {pats: pats, data: br.data}
pub fn assert_is_binding_or_wild(bcx: block, p: @ast::pat) {
    if !pat_is_binding_or_wild(bcx.tcx().def_map, p) {
            fmt!("Expected an identifier pattern but found p: %s",
                 pat_to_str(p, bcx.sess().intr())));

pub type enter_pat<'self> = &'self fn(@ast::pat) -> Option<~[@ast::pat]>;
pub fn enter_match<'r>(bcx: block,
    debug!("enter_match(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    let mut result = ~[];
    for m.iter().advance |br| {
        match e(br.pats[col]) {
                    vec::append(sub, br.pats.slice(0u, col)),
                    br.pats.slice(col + 1u, br.pats.len()));
                let this = br.pats[col];
                    ast::pat_ident(_, path, None) => {
                        if pat_is_binding(dm, this) {
                                br.data.bindings_map.get(
                                    &path_to_ident(path));
                            Store(bcx, val, binding_info.llmatch);
                result.push(@Match {pats: pats, data: br.data});

    debug!("result=%s", matches_to_str(bcx, result));
pub fn enter_default<'r>(bcx: block,
    debug!("enter_default(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    do enter_match(bcx, dm, m, col, val) |p| {
            ast::pat_wild | ast::pat_tup(_) | ast::pat_struct(*) => Some(~[]),
            ast::pat_ident(_, _, None) if pat_is_binding(dm, p) => Some(~[]),
// <pcwalton> nmatsakis: what does enter_opt do?
// <pcwalton> in trans/match
// <pcwalton> trans/match.rs is like stumbling around in a dark cave
// <nmatsakis> pcwalton: the enter family of functions adjust the set of
//             patterns as needed
// <nmatsakis> yeah, at some point I kind of achieved some level of
//             understanding
// <nmatsakis> anyhow, they adjust the patterns given that something of that
//             kind has been found
// <nmatsakis> pcwalton: ok, right, so enter_XXX() adjusts the patterns, as I
//             said
// <nmatsakis> enter_match() kind of embodies the generic code
// <nmatsakis> it is provided with a function that tests each pattern to see
//             if it might possibly apply and so forth
// <nmatsakis> so, if you have a pattern like {a: _, b: _, _} and one like _
// <nmatsakis> then _ would be expanded to (_, _)
// <nmatsakis> one spot for each of the sub-patterns
// <nmatsakis> enter_opt() is one of the more complex; it covers the fallible
//             cases
// <nmatsakis> enter_rec_or_struct() or enter_tuple() are simpler, since they
//             are infallible patterns
// <nmatsakis> so all patterns must either be records (resp. tuples) or
//             wildcards
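The idea the chat describes, specializing each row and expanding `_` to one wildcard per sub-pattern, can be sketched over a toy pattern type (modern Rust; the simplified `Pat` type and this standalone `enter_tup` are invented for illustration):

```rust
#[derive(Clone, Debug, PartialEq)]
enum Pat {
    Tuple(Vec<Pat>),
    Wild,
}

// Specialize every row of a pattern matrix on a tuple of the given arity at
// `col`: the selected pattern is replaced by its sub-patterns, and a `_` row
// contributes one `_` per tuple element.
fn enter_tup(rows: &[Vec<Pat>], col: usize, arity: usize) -> Vec<Vec<Pat>> {
    rows.iter()
        .map(|row| {
            let mut row = row.clone();
            let head = row.remove(col);
            let mut subs = match head {
                Pat::Tuple(subs) => subs,
                Pat::Wild => vec![Pat::Wild; arity],
            };
            subs.extend(row); // sub-patterns first, then the remaining columns
            subs
        })
        .collect()
}
```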
pub fn enter_opt<'r>(bcx: block,
    debug!("enter_opt(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
    do enter_match(bcx, tcx.def_map, m, col, val) |p| {
            ast::pat_ident(_, _, None) if pat_is_const(tcx.def_map, p) => {
                let const_def = tcx.def_map.get_copy(&p.id);
                let const_def_id = ast_util::def_id_of_def(const_def);
                if opt_eq(tcx, &lit(ConstLit(const_def_id)), opt) {
            ast::pat_enum(_, ref subpats) => {
                if opt_eq(tcx, &variant_opt(bcx, p.id), opt) {
                        None => Some(vec::from_elem(variant_size, dummy)),
            ast::pat_ident(_, _, None)
                    if pat_is_variant_or_struct(tcx.def_map, p) => {
                if opt_eq(tcx, &variant_opt(bcx, p.id), opt) {
                if opt_eq(tcx, &lit(ExprLit(l)), opt) {Some(~[])} else {None}
            ast::pat_range(l1, l2) => {
                if opt_eq(tcx, &range(l1, l2), opt) {Some(~[])} else {None}
            ast::pat_struct(_, ref field_pats, _) => {
                if opt_eq(tcx, &variant_opt(bcx, p.id), opt) {
                    // Look up the struct variant ID.
                    match tcx.def_map.get_copy(&p.id) {
                        ast::def_variant(_, found_struct_id) => {
                            struct_id = found_struct_id;
                            tcx.sess.span_bug(p.span, "expected enum variant def");

                    // Reorder the patterns into the same order they were
                    // specified in the struct definition. Also fill in
                    // unspecified fields with dummy.
                    let mut reordered_patterns = ~[];
                    let r = ty::lookup_struct_fields(tcx, struct_id);
                    for r.iter().advance |field| {
                        match field_pats.iter().find_(|p| p.ident == field.ident) {
                            None => reordered_patterns.push(dummy),
                            Some(fp) => reordered_patterns.push(fp.pat)
                    Some(reordered_patterns)
            ast::pat_vec(ref before, slice, ref after) => {
                let n = before.len() + after.len();
                let i = before.len();
                if opt_eq(tcx, &vec_len_ge(n, i), opt) {
                    Some(vec::append_one(copy *before, slice) +
                let n = before.len();
                if opt_eq(tcx, &vec_len_eq(n), opt) {
                assert_is_binding_or_wild(bcx, p);
                Some(vec::from_elem(variant_size, dummy))
pub fn enter_rec_or_struct<'r>(bcx: block,
                              fields: &[ast::ident],
    debug!("enter_rec_or_struct(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
    do enter_match(bcx, dm, m, col, val) |p| {
            ast::pat_struct(_, ref fpats, _) => {
                for fields.iter().advance |fname| {
                    match fpats.iter().find_(|p| p.ident == *fname) {
                        None => pats.push(dummy),
                        Some(pat) => pats.push(pat.pat)
                assert_is_binding_or_wild(bcx, p);
                Some(vec::from_elem(fields.len(), dummy))
pub fn enter_tup<'r>(bcx: block,
    debug!("enter_tup(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
    do enter_match(bcx, dm, m, col, val) |p| {
            ast::pat_tup(ref elts) => {
                assert_is_binding_or_wild(bcx, p);
                Some(vec::from_elem(n_elts, dummy))
pub fn enter_tuple_struct<'r>(bcx: block,
    debug!("enter_tuple_struct(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
    do enter_match(bcx, dm, m, col, val) |p| {
            ast::pat_enum(_, Some(ref elts)) => Some(copy *elts),
                assert_is_binding_or_wild(bcx, p);
                Some(vec::from_elem(n_elts, dummy))
pub fn enter_box<'r>(bcx: block,
    debug!("enter_box(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
    do enter_match(bcx, dm, m, col, val) |p| {
            ast::pat_box(sub) => {
                assert_is_binding_or_wild(bcx, p);
pub fn enter_uniq<'r>(bcx: block,
    debug!("enter_uniq(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
    do enter_match(bcx, dm, m, col, val) |p| {
            ast::pat_uniq(sub) => {
                assert_is_binding_or_wild(bcx, p);
pub fn enter_region<'r>(bcx: block,
    debug!("enter_region(bcx=%s, m=%s, col=%u, val=%?)",
           matches_to_str(bcx, m),
           bcx.val_to_str(val));
    let _indenter = indenter();

    let dummy = @ast::pat { id: 0, node: ast::pat_wild, span: dummy_sp() };
    do enter_match(bcx, dm, m, col, val) |p| {
            ast::pat_region(sub) => {
                assert_is_binding_or_wild(bcx, p);
// Returns the options in one column of matches. An option is something that
// needs to be conditionally matched at runtime; for example, the discriminant
// on a set of enum variants or a literal.
pub fn get_options(bcx: block, m: &[@Match], col: uint) -> ~[Opt] {
    let ccx = bcx.ccx();
    fn add_to_set(tcx: ty::ctxt, set: &mut ~[Opt], val: Opt) {
        if set.iter().any_(|l| opt_eq(tcx, l, &val)) {return;}
        set.push(val);
    }

    let mut found = ~[];
    for m.iter().advance |br| {
        let cur = br.pats[col];
            ast::pat_lit(l) => {
                add_to_set(ccx.tcx, &mut found, lit(ExprLit(l)));
            ast::pat_ident(*) => {
                // This is one of: an enum variant, a unit-like struct, or a
                // constant.
                match ccx.tcx.def_map.find(&cur.id) {
                    Some(&ast::def_variant(*)) => {
                        add_to_set(ccx.tcx, &mut found,
                                   variant_opt(bcx, cur.id));
                    Some(&ast::def_struct(*)) => {
                        add_to_set(ccx.tcx, &mut found,
                                   lit(UnitLikeStructLit(cur.id)));
                    Some(&ast::def_static(const_did, false)) => {
                        add_to_set(ccx.tcx, &mut found,
                                   lit(ConstLit(const_did)));
            ast::pat_enum(*) | ast::pat_struct(*) => {
                // This could be one of: a tuple-like enum variant, a
                // struct-like enum variant, or a struct.
                match ccx.tcx.def_map.find(&cur.id) {
                    Some(&ast::def_fn(*)) |
                    Some(&ast::def_variant(*)) => {
                        add_to_set(ccx.tcx, &mut found,
                                   variant_opt(bcx, cur.id));
                    Some(&ast::def_static(const_did, false)) => {
                        add_to_set(ccx.tcx, &mut found,
                                   lit(ConstLit(const_did)));
            ast::pat_range(l1, l2) => {
                add_to_set(ccx.tcx, &mut found, range(l1, l2));
            ast::pat_vec(ref before, slice, ref after) => {
                let opt = match slice {
                    None => vec_len_eq(before.len()),
                    Some(_) => vec_len_ge(before.len() + after.len(),
                add_to_set(ccx.tcx, &mut found, opt);
pub struct ExtractedBlock {
    vals: ~[ValueRef],
    bcx: block
}

pub fn extract_variant_args(bcx: block,
    let _icx = push_ctxt("match::extract_variant_args");
    let args = do vec::from_fn(adt::num_args(repr, disr_val)) |i| {
        adt::trans_field_ptr(bcx, repr, val, disr_val, i)

    ExtractedBlock { vals: args, bcx: bcx }

fn match_datum(bcx: block, val: ValueRef, pat_id: ast::node_id) -> Datum {
    //! Helper for converting from the ValueRef that we pass around in
    //! the match code, which is always by ref, into a Datum. Eventually
    //! we should just pass around a Datum and be done with it.

    let ty = node_id_type(bcx, pat_id);
    Datum {val: val, ty: ty, mode: datum::ByRef(RevokeClean)}
pub fn extract_vec_elems(bcx: block,
                         pat_id: ast::node_id,
    let _icx = push_ctxt("match::extract_vec_elems");
    let vec_datum = match_datum(bcx, val, pat_id);
    let (bcx, base, len) = vec_datum.get_vec_base_and_len(bcx, pat_span,
    let vt = tvec::vec_types(bcx, node_id_type(bcx, pat_id));

    let mut elems = do vec::from_fn(elem_count) |i| {
            None => GEPi(bcx, base, [i]),
            Some(n) if i < n => GEPi(bcx, base, [i]),
            Some(n) if i > n => {
                InBoundsGEP(bcx, base, [
                        C_int(bcx.ccx(), (elem_count - i) as int))])
            _ => unsafe { llvm::LLVMGetUndef(vt.llunit_ty.to_ref()) }

        let slice_offset = Mul(bcx, vt.llunit_size,
                               C_int(bcx.ccx(), n as int)
        let slice_begin = tvec::pointer_add(bcx, base, slice_offset);
        let slice_len_offset = Mul(bcx, vt.llunit_size,
                                   C_int(bcx.ccx(), (elem_count - 1u) as int)
        let slice_len = Sub(bcx, len, slice_len_offset);
        let slice_ty = ty::mk_evec(bcx.tcx(),
                                   ty::mt {ty: vt.unit_ty, mutbl: ast::m_imm},
                                   ty::vstore_slice(ty::re_static)
        let scratch = scratch_datum(bcx, slice_ty, false);
        Store(bcx, slice_begin,
              GEPi(bcx, scratch.val, [0u, abi::slice_elt_base])
        Store(bcx, slice_len,
              GEPi(bcx, scratch.val, [0u, abi::slice_elt_len])
        elems[n] = scratch.val;
        scratch.add_clean(bcx);

    ExtractedBlock { vals: elems, bcx: bcx }
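The slice bookkeeping above reduces to simple byte arithmetic; a sketch under the assumption that `llunit_size` is the element size in bytes (the function name and units are illustrative):

```rust
// Byte offset and byte length of the `..slice` part of a vector pattern.
// `len` is the vector's byte length, `unit_size` the element size, `n` the
// number of elements before the slice, and `elem_count` counts every
// explicit position in the pattern (before + slice + after).
fn slice_extent(len: usize, unit_size: usize, n: usize, elem_count: usize) -> (usize, usize) {
    let slice_offset = n * unit_size;                     // skip the leading elements
    let slice_len = len - (elem_count - 1) * unit_size;   // drop individually-matched elements
    (slice_offset, slice_len)
}
```

For a ten-element vector of 4-byte elements matched against `[a, ..rest, z]`, the slice starts at byte 4 and covers the middle eight elements (32 bytes).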
// NB: This function does not collect fields from struct-like enum variants.
pub fn collect_record_or_struct_fields(bcx: block,
    let mut fields: ~[ast::ident] = ~[];
    for m.iter().advance |br| {
        match br.pats[col].node {
            ast::pat_struct(_, ref fs, _) => {
                match ty::get(node_id_type(bcx, br.pats[col].id)).sty {
                    ty::ty_struct(*) => extend(&mut fields, *fs),

    fn extend(idents: &mut ~[ast::ident], field_pats: &[ast::field_pat]) {
        for field_pats.iter().advance |field_pat| {
            let field_ident = field_pat.ident;
            if !idents.iter().any_(|x| *x == field_ident) {
                idents.push(field_ident);
pub fn pats_require_rooting(bcx: block,
    do m.iter().any_ |br| {
        let pat_id = br.pats[col].id;
        let key = root_map_key {id: pat_id, derefs: 0u };
        bcx.ccx().maps.root_map.contains_key(&key)
pub fn root_pats_as_necessary(mut bcx: block,
    for m.iter().advance |br| {
        let pat_id = br.pats[col].id;
            let datum = Datum {val: val, ty: node_id_type(bcx, pat_id),
                               mode: ByRef(ZeroMem)};
            bcx = datum.root_and_write_guard(bcx, br.pats[col].span, pat_id, 0);
// Macro for deciding whether any of the remaining matches fit a given kind of
// pattern. Note that, because the macro is well-typed, either ALL of the
// matches should fit that sort of pattern or NONE (however, some of the
// matches may be wildcards like _ or identifiers).
macro_rules! any_pat (
    ($m:expr, $pattern:pat) => (
        do ($m).iter().any_ |br| {
            match br.pats[col].node {
pub fn any_box_pat(m: &[@Match], col: uint) -> bool {
    any_pat!(m, ast::pat_box(_))
}

pub fn any_uniq_pat(m: &[@Match], col: uint) -> bool {
    any_pat!(m, ast::pat_uniq(_))
}

pub fn any_region_pat(m: &[@Match], col: uint) -> bool {
    any_pat!(m, ast::pat_region(_))
}

pub fn any_tup_pat(m: &[@Match], col: uint) -> bool {
    any_pat!(m, ast::pat_tup(_))
}
pub fn any_tuple_struct_pat(bcx: block, m: &[@Match], col: uint) -> bool {
    do m.iter().any_ |br| {
        let pat = br.pats[col];
            ast::pat_enum(_, Some(_)) => {
                match bcx.tcx().def_map.find(&pat.id) {
                    Some(&ast::def_fn(*)) |
                    Some(&ast::def_struct(*)) => true,

pub type mk_fail = @fn() -> BasicBlockRef;
pub fn pick_col(m: &[@Match]) -> uint {
    fn score(p: &ast::pat) -> uint {
            ast::pat_lit(_) | ast::pat_enum(_, _) | ast::pat_range(_, _) => 1u,
            ast::pat_ident(_, _, Some(p)) => score(p),

    let mut scores = vec::from_elem(m[0].pats.len(), 0u);
    for m.iter().advance |br| {
        for br.pats.iter().advance |p| { scores[i] += score(*p); i += 1u; }

    let mut max_score = 0u;
    let mut best_col = 0u;
    for scores.iter().advance |score| {
        // Irrefutable columns always go first, they'd only be duplicated in
        if score == 0u { return i; }
        // If no irrefutable ones are found, we pick the one with the biggest
        // branching factor.
        if score > max_score { max_score = score; best_col = i; }
pub enum branch_kind { no_branch, single, switch, compare, compare_vec_len, }

// Compiles a comparison between two things.
//
// NB: This must produce an i1, not a Rust bool (i8).
pub fn compare_values(cx: block,
    let _icx = push_ctxt("compare_values");
    if ty::type_is_scalar(rhs_t) {
        let rs = compare_scalar_types(cx, lhs, rhs, rhs_t, ast::eq);
        return rslt(rs.bcx, rs.val);

    match ty::get(rhs_t).sty {
        ty::ty_estr(ty::vstore_uniq) => {
            let scratch_result = scratch_datum(cx, ty::mk_bool(), false);
            let scratch_lhs = alloca(cx, val_ty(lhs));
            Store(cx, lhs, scratch_lhs);
            let scratch_rhs = alloca(cx, val_ty(rhs));
            Store(cx, rhs, scratch_rhs);
            let did = cx.tcx().lang_items.uniq_str_eq_fn();
            let bcx = callee::trans_lang_call(cx, did, [scratch_lhs, scratch_rhs],
                                              expr::SaveIn(scratch_result.val));
            let result = scratch_result.to_result(bcx);
                val: bool_to_i1(result.bcx, result.val)

            let scratch_result = scratch_datum(cx, ty::mk_bool(), false);
            let did = cx.tcx().lang_items.str_eq_fn();
            let bcx = callee::trans_lang_call(cx, did, [lhs, rhs],
                                              expr::SaveIn(scratch_result.val));
            let result = scratch_result.to_result(bcx);
                val: bool_to_i1(result.bcx, result.val)

            cx.tcx().sess.bug("only scalars and strings supported in \
fn store_non_ref_bindings(bcx: block,
                          bindings_map: &BindingsMap,
                          mut opt_temp_cleanups: Option<&mut ~[ValueRef]>)
                          -> block {
    /*!
     * For each copy/move binding, copy the value from the value
     * being matched into its final home. This code executes once
     * one of the patterns for a given arm has completely matched.
     * It adds temporary cleanups to the `temp_cleanups` array,
     * if one is provided.
     */

    for bindings_map.each_value |&binding_info| {
        match binding_info.trmode {
            TrByValue(is_move, lldest) => {
                let llval = Load(bcx, binding_info.llmatch); // get a T*
                let datum = Datum {val: llval, ty: binding_info.ty,
                                   mode: ByRef(ZeroMem)};
                    datum.move_to(bcx, INIT, lldest)
                    datum.copy_to(bcx, INIT, lldest)

                do opt_temp_cleanups.mutate |temp_cleanups| {
                    add_clean_temp_mem(bcx, lldest, binding_info.ty);
                    temp_cleanups.push(lldest);
fn insert_lllocals(bcx: block,
                   bindings_map: &BindingsMap,
                   binding_mode: IrrefutablePatternBindingMode,
                   add_cleans: bool) -> block {
    /*!
     * For each binding in `data.bindings_map`, adds an appropriate entry into
     * the `fcx.lllocals` map. If add_cleans is true, then adds cleanups for
     * the bindings.
     */

    let llmap = match binding_mode {
        BindLocal => bcx.fcx.lllocals,
        BindArgument => bcx.fcx.llargs

    for bindings_map.each_value |&binding_info| {
        let llval = match binding_info.trmode {
            // By value bindings: use the stack slot that we
            // copied/moved the value into
            TrByValue(_, lldest) => {
                    add_clean(bcx, lldest, binding_info.ty);

            // By ref binding: use the ptr into the matched value
                binding_info.llmatch

        debug!("binding %? to %s", binding_info.id, bcx.val_to_str(llval));
        llmap.insert(binding_info.id, llval);
pub fn compile_guard(bcx: block,
                     guard_expr: @ast::expr,
                     chk: Option<mk_fail>)
    debug!("compile_guard(bcx=%s, guard_expr=%s, m=%s, vals=%?)",
           bcx.expr_to_str(guard_expr),
           matches_to_str(bcx, m),
           vals.map(|v| bcx.val_to_str(*v)));
    let _indenter = indenter();

    let mut temp_cleanups = ~[];
    bcx = store_non_ref_bindings(bcx, &data.bindings_map, Some(&mut temp_cleanups));
    bcx = insert_lllocals(bcx, &data.bindings_map, BindLocal, false);

    let val = unpack_result!(bcx, {
        do with_scope_result(bcx, guard_expr.info(),
            expr::trans_to_datum(bcx, guard_expr).to_result()
    let val = bool_to_i1(bcx, val);

    // Revoke the temp cleanups now that the guard successfully executed.
    for temp_cleanups.iter().advance |llval| {
        revoke_clean(bcx, *llval);

    return do with_cond(bcx, Not(bcx, val)) |bcx| {
        // Guard does not match: free the values we copied,
        // and remove all bindings from the lllocals table
        let bcx = drop_bindings(bcx, data);
        compile_submatch(bcx, m, vals, chk);
fn drop_bindings(bcx: block, data: &ArmData) -> block {
    for data.bindings_map.each_value |&binding_info| {
        match binding_info.trmode {
            TrByValue(_, llval) => {
                bcx = glue::drop_ty(bcx, llval, binding_info.ty);
        bcx.fcx.lllocals.remove(&binding_info.id);
pub fn compile_submatch(bcx: block,
                        m: &[@Match],
                        vals: &[ValueRef],
                        chk: Option<mk_fail>) {
    debug!("compile_submatch(bcx=%s, m=%s, vals=%?)",
           bcx.to_str(),
           matches_to_str(bcx, m),
           vals.map(|v| bcx.val_to_str(*v)));
    let _indenter = indenter();

    /*
      For an empty match, a fall-through case must exist
     */
    assert!((m.len() > 0u || chk.is_some()));
    let _icx = push_ctxt("match::compile_submatch");
    let mut bcx = bcx;
    let tcx = bcx.tcx();
    let dm = tcx.def_map;
    if m.len() == 0u {
        Br(bcx, chk.get()());
        return;
    }
    if m[0].pats.len() == 0u {
        let data = m[0].data;
        match data.arm.guard {
            Some(guard_expr) => {
                bcx = compile_guard(bcx, guard_expr, m[0].data,
                                    m.slice(1, m.len()),
                                    vals, chk);
            }
            _ => ()
        }
        Br(bcx, data.bodycx.llbb);
        return;
    }

    let col = pick_col(m);
    let val = vals[col];
    let m = {
        if has_nested_bindings(m, col) {
            expand_nested_bindings(bcx, m, col, val)
        } else {
            m.to_owned()
        }
    };

    let vals_left = vec::append(vals.slice(0u, col).to_owned(),
                                vals.slice(col + 1u, vals.len()));
    let ccx = bcx.fcx.ccx;
    let mut pat_id = 0;
    let mut pat_span = dummy_sp();
    for m.iter().advance |br| {
        // Find a real id (we're adding placeholder wildcard patterns, but
        // each column is guaranteed to have at least one real pattern)
        if pat_id == 0 {
            pat_id = br.pats[col].id;
            pat_span = br.pats[col].span;
        }
    }

    // If we are not matching against an `@T`, we should not be
    // required to root any values.
    assert!(any_box_pat(m, col) || !pats_require_rooting(bcx, m, col));

    let rec_fields = collect_record_or_struct_fields(bcx, m, col);
    if rec_fields.len() > 0 {
        let pat_ty = node_id_type(bcx, pat_id);
        let pat_repr = adt::represent_type(bcx.ccx(), pat_ty);
        do expr::with_field_tys(tcx, pat_ty, None) |discr, field_tys| {
            let rec_vals = rec_fields.map(|field_name| {
                let ix = ty::field_idx_strict(tcx, *field_name, field_tys);
                adt::trans_field_ptr(bcx, pat_repr, val, discr, ix)
            });
            compile_submatch(
                bcx,
                enter_rec_or_struct(bcx, dm, m, col, rec_fields, val),
                vec::append(rec_vals, vals_left),
                chk);
        }
        return;
    }

    if any_tup_pat(m, col) {
        let tup_ty = node_id_type(bcx, pat_id);
        let tup_repr = adt::represent_type(bcx.ccx(), tup_ty);
        let n_tup_elts = match ty::get(tup_ty).sty {
            ty::ty_tup(ref elts) => elts.len(),
            _ => ccx.sess.bug("non-tuple type in tuple pattern")
        };
        let tup_vals = do vec::from_fn(n_tup_elts) |i| {
            adt::trans_field_ptr(bcx, tup_repr, val, 0, i)
        };
        compile_submatch(bcx, enter_tup(bcx, dm, m, col, val, n_tup_elts),
                         vec::append(tup_vals, vals_left), chk);
        return;
    }

    if any_tuple_struct_pat(bcx, m, col) {
        let struct_ty = node_id_type(bcx, pat_id);
        let struct_element_count;
        match ty::get(struct_ty).sty {
            ty::ty_struct(struct_id, _) => {
                struct_element_count =
                    ty::lookup_struct_fields(tcx, struct_id).len();
            }
            _ => {
                ccx.sess.bug("non-struct type in tuple struct pattern");
            }
        }

        let struct_repr = adt::represent_type(bcx.ccx(), struct_ty);
        let llstructvals = do vec::from_fn(struct_element_count) |i| {
            adt::trans_field_ptr(bcx, struct_repr, val, 0, i)
        };
        compile_submatch(bcx,
                         enter_tuple_struct(bcx, dm, m, col, val,
                                            struct_element_count),
                         vec::append(llstructvals, vals_left),
                         chk);
        return;
    }

    // Unbox in case of a box field
    if any_box_pat(m, col) {
        bcx = root_pats_as_necessary(bcx, m, col, val);
        let llbox = Load(bcx, val);
        let unboxed = GEPi(bcx, llbox, [0u, abi::box_field_body]);
        compile_submatch(bcx, enter_box(bcx, dm, m, col, val),
                         vec::append(~[unboxed], vals_left), chk);
        return;
    }

    if any_uniq_pat(m, col) {
        let llbox = Load(bcx, val);
        let unboxed = GEPi(bcx, llbox, [0u, abi::box_field_body]);
        compile_submatch(bcx, enter_uniq(bcx, dm, m, col, val),
                         vec::append(~[unboxed], vals_left), chk);
        return;
    }

    if any_region_pat(m, col) {
        let loaded_val = Load(bcx, val);
        compile_submatch(bcx, enter_region(bcx, dm, m, col, val),
                         vec::append(~[loaded_val], vals_left), chk);
        return;
    }

    // Decide what kind of branch we need
    let opts = get_options(bcx, m, col);
    let mut kind = no_branch;
    let mut test_val = val;
    if opts.len() > 0u {
        match opts[0] {
            var(_, repr) => {
                let (the_kind, val_opt) = adt::trans_switch(bcx, repr, val);
                kind = the_kind;
                for val_opt.iter().advance |&tval| { test_val = tval; }
            }
            lit(_) => {
                let pty = node_id_type(bcx, pat_id);
                test_val = load_if_immediate(bcx, val, pty);
                kind = if ty::type_is_integral(pty) { switch }
                       else { compare };
            }
            range(_, _) => {
                test_val = Load(bcx, val);
                kind = compare;
            }
            vec_len_eq(*) | vec_len_ge(*) => {
                let vt = tvec::vec_types(bcx, node_id_type(bcx, pat_id));
                let unboxed = load_if_immediate(bcx, val, vt.vec_ty);
                let (_, len) = tvec::get_base_and_len(
                    bcx, unboxed, vt.vec_ty
                );
                test_val = SDiv(bcx, len, vt.llunit_size);
                kind = compare_vec_len;
            }
        }
    }
    for opts.iter().advance |o| {
        match *o {
            range(_, _) => { kind = compare; break }
            _ => ()
        }
    }
    let else_cx = match kind {
        no_branch | single => bcx,
        _ => sub_block(bcx, "match_else")
    };
    let sw = if kind == switch {
        Switch(bcx, test_val, else_cx.llbb, opts.len())
    } else {
        C_int(ccx, 0) // Placeholder for when not using a switch
    };

    let defaults = enter_default(else_cx, dm, m, col, val);
    let exhaustive = chk.is_none() && defaults.len() == 0u;
    let len = opts.len();
    let mut i = 0u;

    // Compile subtrees for each option
    for opts.iter().advance |opt| {
        i += 1u;
        let mut opt_cx = else_cx;
        if !exhaustive || i < len {
            opt_cx = sub_block(bcx, "match_case");
            match kind {
                single => Br(bcx, opt_cx.llbb),
                switch => {
                    match trans_opt(bcx, opt) {
                        single_result(r) => {
                            unsafe {
                                llvm::LLVMAddCase(sw, r.val, opt_cx.llbb);
                                bcx = r.bcx;
                            }
                        }
                        _ => {
                            bcx.sess().bug(
                                "in compile_submatch, expected \
                                 trans_opt to return a single_result")
                        }
                    }
                }
                compare => {
                    let t = node_id_type(bcx, pat_id);
                    let Result {bcx: after_cx, val: matches} = {
                        do with_scope_result(bcx, None,
                                             "compare_scope") |bcx| {
                            match trans_opt(bcx, opt) {
                                single_result(
                                    Result {bcx, val}) => {
                                    compare_values(bcx, test_val, val, t)
                                }
                                lower_bound(
                                    Result {bcx, val}) => {
                                    compare_scalar_types(
                                        bcx, test_val, val,
                                        t, ast::ge)
                                }
                                range_result(
                                    Result {val: vbegin, _},
                                    Result {bcx, val: vend}) => {
                                    let Result {bcx, val: llge} =
                                        compare_scalar_types(
                                            bcx, test_val,
                                            vbegin, t, ast::ge);
                                    let Result {bcx, val: llle} =
                                        compare_scalar_types(
                                            bcx, test_val, vend,
                                            t, ast::le);
                                    rslt(bcx, And(bcx, llge, llle))
                                }
                            }
                        }
                    };
                    bcx = sub_block(after_cx, "compare_next");
                    CondBr(after_cx, matches, opt_cx.llbb, bcx.llbb);
                }
                compare_vec_len => {
                    let Result {bcx: after_cx, val: matches} = {
                        do with_scope_result(bcx, None,
                                             "compare_vec_len_scope") |bcx| {
                            match trans_opt(bcx, opt) {
                                single_result(
                                    Result {bcx, val}) => {
                                    let value = compare_scalar_values(
                                        bcx, test_val, val,
                                        signed_int, ast::eq);
                                    rslt(value.bcx, value.val)
                                }
                                lower_bound(
                                    Result {bcx, val: val}) => {
                                    let value = compare_scalar_values(
                                        bcx, test_val, val,
                                        signed_int, ast::ge);
                                    rslt(value.bcx, value.val)
                                }
                                range_result(
                                    Result {val: vbegin, _},
                                    Result {bcx, val: vend}) => {
                                    let Result {bcx, val: llge} =
                                        compare_scalar_values(
                                            bcx, test_val,
                                            vbegin, signed_int, ast::ge);
                                    let Result {bcx, val: llle} =
                                        compare_scalar_values(
                                            bcx, test_val, vend,
                                            signed_int, ast::le);
                                    rslt(bcx, And(bcx, llge, llle))
                                }
                            }
                        }
                    };
                    bcx = sub_block(after_cx, "compare_vec_len_next");
                    CondBr(after_cx, matches, opt_cx.llbb, bcx.llbb);
                }
                _ => ()
            }
        } else if kind == compare || kind == compare_vec_len {
            Br(bcx, else_cx.llbb);
        }

        let mut size = 0u;
        let mut unpacked = ~[];
        match *opt {
            var(disr_val, repr) => {
                let ExtractedBlock {vals: argvals, bcx: new_bcx} =
                    extract_variant_args(opt_cx, repr, disr_val, val);
                size = argvals.len();
                unpacked = argvals;
                opt_cx = new_bcx;
            }
            vec_len_eq(n) | vec_len_ge(n, _) => {
                let n = match *opt {
                    vec_len_ge(*) => n + 1u,
                    _ => n
                };
                let slice = match *opt {
                    vec_len_ge(_, i) => Some(i),
                    _ => None
                };
                let args = extract_vec_elems(opt_cx, pat_span, pat_id, n, slice,
                                             val, test_val);
                size = args.vals.len();
                unpacked = /*bad*/copy args.vals;
                opt_cx = args.bcx;
            }
            lit(_) | range(_, _) => ()
        }
        let opt_ms = enter_opt(opt_cx, m, opt, col, size, val);
        let opt_vals = vec::append(unpacked, vals_left);
        compile_submatch(opt_cx, opt_ms, opt_vals, chk);
    }

    // Compile the fall-through case, if any
    if !exhaustive {
        if kind == compare || kind == compare_vec_len {
            Br(bcx, else_cx.llbb);
        }
        if kind != single {
            compile_submatch(else_cx, defaults, vals_left, chk);
        }
    }
}
pub fn trans_match(bcx: block,
                   match_expr: &ast::expr,
                   discr_expr: @ast::expr,
                   arms: &[ast::arm],
                   dest: Dest) -> block {
    let _icx = push_ctxt("match::trans_match");
    do with_scope(bcx, match_expr.info(), "match") |bcx| {
        trans_match_inner(bcx, discr_expr, arms, dest)
    }
}
fn create_bindings_map(bcx: block, pat: @ast::pat) -> BindingsMap {
    // Create the bindings map, which is a mapping from each binding name
    // to an alloca() that will be the value for that local variable.
    // Note that we use the names because each binding will have many ids
    // from the various alternatives.
    let ccx = bcx.ccx();
    let tcx = bcx.tcx();
    let mut bindings_map = HashMap::new();
    do pat_bindings(tcx.def_map, pat) |bm, p_id, _s, path| {
        let ident = path_to_ident(path);
        let variable_ty = node_id_type(bcx, p_id);
        let llvariable_ty = type_of::type_of(ccx, variable_ty);

        let llmatch;
        let trmode;
        match bm {
            ast::bind_infer => {
                // in this case, the final type of the variable will be T,
                // but during matching we need to store a *T as explained
                // above
                let is_move = ccx.maps.moves_map.contains(&p_id);
                llmatch = alloca(bcx, llvariable_ty.ptr_to());
                trmode = TrByValue(is_move, alloca(bcx, llvariable_ty));
            }
            ast::bind_by_ref(_) => {
                llmatch = alloca(bcx, llvariable_ty);
                trmode = TrByRef;
            }
        };
        bindings_map.insert(ident, BindingInfo {
            llmatch: llmatch, trmode: trmode,
            id: p_id, ty: variable_ty
        });
    }
    return bindings_map;
}
pub fn trans_match_inner(scope_cx: block,
                         discr_expr: @ast::expr,
                         arms: &[ast::arm],
                         dest: Dest) -> block {
    let _icx = push_ctxt("match::trans_match_inner");
    let mut bcx = scope_cx;
    let tcx = bcx.tcx();

    let discr_datum = unpack_datum!(bcx, {
        expr::trans_to_datum(bcx, discr_expr)
    });
    if bcx.unreachable {
        return bcx;
    }

    let mut arm_datas = ~[];
    let mut matches = ~[];
    for arms.iter().advance |arm| {
        let body = scope_block(bcx, arm.body.info(), "case_body");
        let bindings_map = create_bindings_map(bcx, arm.pats[0]);
        let arm_data = @ArmData {bodycx: body,
                                 arm: arm,
                                 bindings_map: bindings_map};
        arm_datas.push(arm_data);
        for arm.pats.iter().advance |p| {
            matches.push(@Match {pats: ~[*p], data: arm_data});
        }
    }

    let t = node_id_type(bcx, discr_expr.id);
    let chk = {
        if ty::type_is_empty(tcx, t) {
            // Special case for empty types
            let fail_cx = @mut None;
            let f: mk_fail = || mk_fail(scope_cx, discr_expr.span,
                                        @"scrutinizing value that can't exist",
                                        fail_cx);
            Some(f)
        } else {
            None
        }
    };
    let lldiscr = discr_datum.to_zeroable_ref_llval(bcx);
    compile_submatch(bcx, matches, [lldiscr], chk);

    let mut arm_cxs = ~[];
    for arm_datas.iter().advance |arm_data| {
        let mut bcx = arm_data.bodycx;

        // If this arm has a guard, then the various by-value bindings have
        // already been copied into their homes. If not, we do it here. This
        // is just to reduce code space. See extensive comment at the start
        // of the file for more details.
        if arm_data.arm.guard.is_none() {
            bcx = store_non_ref_bindings(bcx, &arm_data.bindings_map, None);
        }

        // insert bindings into the lllocals map and add cleanups
        bcx = insert_lllocals(bcx, &arm_data.bindings_map, BindLocal, true);

        bcx = controlflow::trans_block(bcx, &arm_data.arm.body, dest);
        bcx = trans_block_cleanups(bcx, block_cleanups(arm_data.bodycx));
        arm_cxs.push(bcx);
    }

    bcx = controlflow::join_blocks(scope_cx, arm_cxs);
    return bcx;
}
fn mk_fail(bcx: block, sp: span, msg: @str,
           finished: @mut Option<BasicBlockRef>) -> BasicBlockRef {
    match *finished { Some(bb) => return bb, _ => () }
    let fail_cx = sub_block(bcx, "case_fallthrough");
    controlflow::trans_fail(fail_cx, Some(sp), msg);
    *finished = Some(fail_cx.llbb);
    return fail_cx.llbb;
}
pub enum IrrefutablePatternBindingMode {
    // Stores the association between node ID and LLVM value in `lllocals`.
    BindLocal,
    // Stores the association between node ID and LLVM value in `llargs`.
    BindArgument
}
// Not match-related, but similar to the pattern-munging code above
pub fn bind_irrefutable_pat(bcx: block,
                            pat: @ast::pat,
                            val: ValueRef,
                            make_copy: bool,
                            binding_mode: IrrefutablePatternBindingMode)
                         -> block {
    let _icx = push_ctxt("match::bind_irrefutable_pat");
    let ccx = bcx.fcx.ccx;
    let mut bcx = bcx;

    // Necessary since bind_irrefutable_pat is called outside trans_match
    match pat.node {
        ast::pat_ident(_, _, ref inner) => {
            if pat_is_variant_or_struct(bcx.tcx().def_map, pat) {
                return bcx;
            }

            if make_copy {
                let binding_ty = node_id_type(bcx, pat.id);
                let datum = Datum {val: val, ty: binding_ty,
                                   mode: ByRef(RevokeClean)};
                let scratch = scratch_datum(bcx, binding_ty, false);
                datum.copy_to_datum(bcx, INIT, scratch);
                match binding_mode {
                    BindLocal => {
                        bcx.fcx.lllocals.insert(pat.id, scratch.val);
                    }
                    BindArgument => {
                        bcx.fcx.llargs.insert(pat.id, scratch.val);
                    }
                }
                add_clean(bcx, scratch.val, binding_ty);
            } else {
                match binding_mode {
                    BindLocal => {
                        bcx.fcx.lllocals.insert(pat.id, val);
                    }
                    BindArgument => {
                        bcx.fcx.llargs.insert(pat.id, val);
                    }
                }
            }

            for inner.iter().advance |inner_pat| {
                bcx = bind_irrefutable_pat(
                    bcx, *inner_pat, val, true, binding_mode);
            }
        }
        ast::pat_enum(_, ref sub_pats) => {
            match bcx.tcx().def_map.find(&pat.id) {
                Some(&ast::def_variant(enum_id, var_id)) => {
                    let repr = adt::represent_node(bcx, pat.id);
                    let vinfo = ty::enum_variant_with_id(ccx.tcx,
                                                         enum_id,
                                                         var_id);
                    let args = extract_variant_args(bcx,
                                                    repr,
                                                    vinfo.disr_val,
                                                    val);
                    for sub_pats.iter().advance |sub_pat| {
                        for args.vals.iter().enumerate().advance |(i, argval)| {
                            bcx = bind_irrefutable_pat(bcx,
                                                       sub_pat[i],
                                                       *argval,
                                                       make_copy,
                                                       binding_mode);
                        }
                    }
                }
                Some(&ast::def_fn(*)) |
                Some(&ast::def_struct(*)) => {
                    match *sub_pats {
                        None => {
                            // This is a unit-like struct. Nothing to do here.
                        }
                        Some(ref elems) => {
                            // This is the tuple struct case.
                            let repr = adt::represent_node(bcx, pat.id);
                            for elems.iter().enumerate().advance |(i, elem)| {
                                let fldptr = adt::trans_field_ptr(bcx, repr,
                                                                  val, 0, i);
                                bcx = bind_irrefutable_pat(bcx,
                                                           *elem,
                                                           fldptr,
                                                           make_copy,
                                                           binding_mode);
                            }
                        }
                    }
                }
                Some(&ast::def_static(_, false)) => {
                    bcx = bind_irrefutable_pat(bcx, pat, val, make_copy,
                                               binding_mode);
                }
                _ => {
                    // Nothing to do here.
                }
            }
        }
        ast::pat_struct(_, ref fields, _) => {
            let tcx = bcx.tcx();
            let pat_ty = node_id_type(bcx, pat.id);
            let pat_repr = adt::represent_type(bcx.ccx(), pat_ty);
            do expr::with_field_tys(tcx, pat_ty, None) |discr, field_tys| {
                for fields.iter().advance |f| {
                    let ix = ty::field_idx_strict(tcx, f.ident, field_tys);
                    let fldptr = adt::trans_field_ptr(bcx, pat_repr, val,
                                                      discr, ix);
                    bcx = bind_irrefutable_pat(bcx,
                                               f.pat,
                                               fldptr,
                                               make_copy,
                                               binding_mode);
                }
            }
        }
        ast::pat_tup(ref elems) => {
            let repr = adt::represent_node(bcx, pat.id);
            for elems.iter().enumerate().advance |(i, elem)| {
                let fldptr = adt::trans_field_ptr(bcx, repr, val, 0, i);
                bcx = bind_irrefutable_pat(bcx,
                                           *elem,
                                           fldptr,
                                           make_copy,
                                           binding_mode);
            }
        }
        ast::pat_box(inner) | ast::pat_uniq(inner) => {
            let llbox = Load(bcx, val);
            let unboxed = GEPi(bcx, llbox, [0u, abi::box_field_body]);
            bcx = bind_irrefutable_pat(bcx,
                                       inner,
                                       unboxed,
                                       true,
                                       binding_mode);
        }
        ast::pat_region(inner) => {
            let loaded_val = Load(bcx, val);
            bcx = bind_irrefutable_pat(bcx,
                                       inner,
                                       loaded_val,
                                       true,
                                       binding_mode);
        }
        ast::pat_wild | ast::pat_lit(_) | ast::pat_range(_, _) |
        ast::pat_vec(*) => ()
    }
    return bcx;
}