]> git.proxmox.com Git - rustc.git/blob - src/librustc/middle/trans/_match.rs
Imported Upstream version 0.7
[rustc.git] / src / librustc / middle / trans / _match.rs
1 // Copyright 2012 The Rust Project Developers. See the COPYRIGHT
2 // file at the top-level directory of this distribution and at
3 // http://rust-lang.org/COPYRIGHT.
4 //
5 // Licensed under the Apache License, Version 2.0 <LICENSE-APACHE or
6 // http://www.apache.org/licenses/LICENSE-2.0> or the MIT license
7 // <LICENSE-MIT or http://opensource.org/licenses/MIT>, at your
8 // option. This file may not be copied, modified, or distributed
9 // except according to those terms.
10
11 /*!
12 *
13 * # Compilation of match statements
14 *
15 * I will endeavor to explain the code as best I can. I have only a loose
16 * understanding of some parts of it.
17 *
18 * ## Matching
19 *
20 * The basic state of the code is maintained in an array `m` of `@Match`
21 * objects. Each `@Match` describes some list of patterns, all of which must
22 * match against the current list of values. If those patterns match, then
23 * the arm listed in the match is the correct arm. A given arm may have
24 * multiple corresponding match entries, one for each alternative that
25 * remains. As we proceed these sets of matches are adjusted by the various
26 * `enter_XXX()` functions, each of which adjusts the set of options given
27 * some information about the value which has been matched.
28 *
29 * So, initially, there is one value and N matches, each of which have one
30 * constituent pattern. N here is usually the number of arms but may be
31 * greater, if some arms have multiple alternatives. For example, here:
32 *
33 * enum Foo { A, B(int), C(uint, uint) }
34 * match foo {
35 * A => ...,
36 * B(x) => ...,
37 * C(1u, 2) => ...,
38 * C(_) => ...
39 * }
40 *
41 * The value would be `foo`. There would be four matches, each of which
42 * contains one pattern (and, in one case, a guard). We could collect the
43 * various options and then compile the code for the case where `foo` is an
44 * `A`, a `B`, and a `C`. When we generate the code for `C`, we would (1)
45 * drop the two matches that do not match a `C` and (2) expand the other two
46 * into two patterns each. In the first case, the two patterns would be `1u`
47 * and `2`, and the in the second case the _ pattern would be expanded into
48 * `_` and `_`. The two values are of course the arguments to `C`.
49 *
50 * Here is a quick guide to the various functions:
51 *
52 * - `compile_submatch()`: The main workhouse. It takes a list of values and
53 * a list of matches and finds the various possibilities that could occur.
54 *
55 * - `enter_XXX()`: modifies the list of matches based on some information
56 * about the value that has been matched. For example,
57 * `enter_rec_or_struct()` adjusts the values given that a record or struct
58 * has been matched. This is an infallible pattern, so *all* of the matches
59 * must be either wildcards or record/struct patterns. `enter_opt()`
60 * handles the fallible cases, and it is correspondingly more complex.
61 *
62 * ## Bindings
63 *
64 * We store information about the bound variables for each arm as part of the
65 * per-arm `ArmData` struct. There is a mapping from identifiers to
66 * `BindingInfo` structs. These structs contain the mode/id/type of the
67 * binding, but they also contain up to two LLVM values, called `llmatch` and
68 * `llbinding` respectively (the `llbinding`, as will be described shortly, is
69 * optional and only present for by-value bindings---therefore it is bundled
70 * up as part of the `TransBindingMode` type). Both point at allocas.
71 *
72 * The `llmatch` binding always stores a pointer into the value being matched
73 * which points at the data for the binding. If the value being matched has
74 * type `T`, then, `llmatch` will point at an alloca of type `T*` (and hence
75 * `llmatch` has type `T**`). So, if you have a pattern like:
76 *
77 * let a: A = ...;
78 * let b: B = ...;
79 * match (a, b) { (ref c, copy d) => { ... } }
80 *
81 * For `c` and `d`, we would generate allocas of type `C*` and `D*`
82 * respectively. These are called the `llmatch`. As we match, when we come
83 * up against an identifier, we store the current pointer into the
84 * corresponding alloca.
85 *
86 * In addition, for each by-value binding (copy or move), we will create a
87 * second alloca (`llbinding`) that will hold the final value. In this
88 * example, that means that `d` would have this second alloca of type `D` (and
89 * hence `llbinding` has type `D*`).
90 *
91 * Once a pattern is completely matched, and assuming that there is no guard
92 * pattern, we will branch to a block that leads to the body itself. For any
93 * by-value bindings, this block will first load the ptr from `llmatch` (the
94 * one of type `D*`) and copy/move the value into `llbinding` (the one of type
95 * `D`). The second alloca then becomes the value of the local variable. For
96 * by ref bindings, the value of the local variable is simply the first
97 * alloca.
98 *
99 * So, for the example above, we would generate a setup kind of like this:
100 *
101 * +-------+
102 * | Entry |
103 * +-------+
104 * |
105 * +-------------------------------------------+
106 * | llmatch_c = (addr of first half of tuple) |
107 * | llmatch_d = (addr of first half of tuple) |
108 * +-------------------------------------------+
109 * |
110 * +--------------------------------------+
111 * | *llbinding_d = **llmatch_dlbinding_d |
112 * +--------------------------------------+
113 *
114 * If there is a guard, the situation is slightly different, because we must
115 * execute the guard code. Moreover, we need to do so once for each of the
116 * alternatives that lead to the arm, because if the guard fails, they may
117 * have different points from which to continue the search. Therefore, in that
118 * case, we generate code that looks more like:
119 *
120 * +-------+
121 * | Entry |
122 * +-------+
123 * |
124 * +-------------------------------------------+
125 * | llmatch_c = (addr of first half of tuple) |
126 * | llmatch_d = (addr of first half of tuple) |
127 * +-------------------------------------------+
128 * |
129 * +-------------------------------------------------+
130 * | *llbinding_d = **llmatch_dlbinding_d |
131 * | check condition |
132 * | if false { free *llbinding_d, goto next case } |
133 * | if true { goto body } |
134 * +-------------------------------------------------+
135 *
136 * The handling for the cleanups is a bit... sensitive. Basically, the body
137 * is the one that invokes `add_clean()` for each binding. During the guard
138 * evaluation, we add temporary cleanups and revoke them after the guard is
139 * evaluated (it could fail, after all). Presuming the guard fails, we drop
140 * the various values we copied explicitly. Note that guards and moves are
141 * just plain incompatible.
142 *
143 * Some relevant helper functions that manage bindings:
144 * - `create_bindings_map()`
145 * - `store_non_ref_bindings()`
146 * - `insert_lllocals()`
147 *
148 */
149
150
151 use back::abi;
152 use lib::llvm::{llvm, ValueRef, BasicBlockRef};
153 use middle::const_eval;
154 use middle::borrowck::root_map_key;
155 use middle::pat_util::*;
156 use middle::resolve::DefMap;
157 use middle::trans::adt;
158 use middle::trans::base::*;
159 use middle::trans::build::*;
160 use middle::trans::callee;
161 use middle::trans::common::*;
162 use middle::trans::consts;
163 use middle::trans::controlflow;
164 use middle::trans::datum;
165 use middle::trans::datum::*;
166 use middle::trans::expr::Dest;
167 use middle::trans::expr;
168 use middle::trans::glue;
169 use middle::trans::tvec;
170 use middle::trans::type_of;
171 use middle::ty;
172 use util::common::indenter;
173
174 use std::hashmap::HashMap;
175 use std::vec;
176 use syntax::ast;
177 use syntax::ast::ident;
178 use syntax::ast_util::path_to_ident;
179 use syntax::ast_util;
180 use syntax::codemap::{span, dummy_sp};
181 use syntax::print::pprust::pat_to_str;
182
183 // An option identifying a literal: either a unit-like struct or an
184 // expression.
185 pub enum Lit {
186 UnitLikeStructLit(ast::node_id), // the node ID of the pattern
187 ExprLit(@ast::expr),
188 ConstLit(ast::def_id), // the def ID of the constant
189 }
190
191 // An option identifying a branch (either a literal, a enum variant or a
192 // range)
193 pub enum Opt {
194 lit(Lit),
195 var(/* disr val */int, @adt::Repr),
196 range(@ast::expr, @ast::expr),
197 vec_len_eq(uint),
198 vec_len_ge(uint, /* slice */uint)
199 }
200
201 pub fn opt_eq(tcx: ty::ctxt, a: &Opt, b: &Opt) -> bool {
202 match (a, b) {
203 (&lit(a), &lit(b)) => {
204 match (a, b) {
205 (UnitLikeStructLit(a), UnitLikeStructLit(b)) => a == b,
206 _ => {
207 let a_expr;
208 match a {
209 ExprLit(existing_a_expr) => a_expr = existing_a_expr,
210 ConstLit(a_const) => {
211 let e = const_eval::lookup_const_by_id(tcx, a_const);
212 a_expr = e.get();
213 }
214 UnitLikeStructLit(_) => {
215 fail!("UnitLikeStructLit should have been handled \
216 above")
217 }
218 }
219
220 let b_expr;
221 match b {
222 ExprLit(existing_b_expr) => b_expr = existing_b_expr,
223 ConstLit(b_const) => {
224 let e = const_eval::lookup_const_by_id(tcx, b_const);
225 b_expr = e.get();
226 }
227 UnitLikeStructLit(_) => {
228 fail!("UnitLikeStructLit should have been handled \
229 above")
230 }
231 }
232
233 match const_eval::compare_lit_exprs(tcx, a_expr, b_expr) {
234 Some(val1) => val1 == 0,
235 None => fail!("compare_list_exprs: type mismatch"),
236 }
237 }
238 }
239 }
240 (&range(a1, a2), &range(b1, b2)) => {
241 let m1 = const_eval::compare_lit_exprs(tcx, a1, b1);
242 let m2 = const_eval::compare_lit_exprs(tcx, a2, b2);
243 match (m1, m2) {
244 (Some(val1), Some(val2)) => (val1 == 0 && val2 == 0),
245 _ => fail!("compare_list_exprs: type mismatch"),
246 }
247 }
248 (&var(a, _), &var(b, _)) => a == b,
249 (&vec_len_eq(a), &vec_len_eq(b)) => a == b,
250 (&vec_len_ge(a, _), &vec_len_ge(b, _)) => a == b,
251 _ => false
252 }
253 }
254
255 pub enum opt_result {
256 single_result(Result),
257 lower_bound(Result),
258 range_result(Result, Result),
259 }
260 pub fn trans_opt(bcx: block, o: &Opt) -> opt_result {
261 let _icx = push_ctxt("match::trans_opt");
262 let ccx = bcx.ccx();
263 let bcx = bcx;
264 match *o {
265 lit(ExprLit(lit_expr)) => {
266 let datumblock = expr::trans_to_datum(bcx, lit_expr);
267 return single_result(datumblock.to_result());
268 }
269 lit(UnitLikeStructLit(pat_id)) => {
270 let struct_ty = ty::node_id_to_type(bcx.tcx(), pat_id);
271 let datumblock = datum::scratch_datum(bcx, struct_ty, true);
272 return single_result(datumblock.to_result(bcx));
273 }
274 lit(ConstLit(lit_id)) => {
275 let llval = consts::get_const_val(bcx.ccx(), lit_id);
276 return single_result(rslt(bcx, llval));
277 }
278 var(disr_val, repr) => {
279 return adt::trans_case(bcx, repr, disr_val);
280 }
281 range(l1, l2) => {
282 return range_result(rslt(bcx, consts::const_expr(ccx, l1)),
283 rslt(bcx, consts::const_expr(ccx, l2)));
284 }
285 vec_len_eq(n) => {
286 return single_result(rslt(bcx, C_int(ccx, n as int)));
287 }
288 vec_len_ge(n, _) => {
289 return lower_bound(rslt(bcx, C_int(ccx, n as int)));
290 }
291 }
292 }
293
294 pub fn variant_opt(bcx: block, pat_id: ast::node_id)
295 -> Opt {
296 let ccx = bcx.ccx();
297 match ccx.tcx.def_map.get_copy(&pat_id) {
298 ast::def_variant(enum_id, var_id) => {
299 let variants = ty::enum_variants(ccx.tcx, enum_id);
300 for (*variants).iter().advance |v| {
301 if var_id == v.id {
302 return var(v.disr_val,
303 adt::represent_node(bcx, pat_id))
304 }
305 }
306 ::std::util::unreachable();
307 }
308 ast::def_fn(*) |
309 ast::def_struct(_) => {
310 return lit(UnitLikeStructLit(pat_id));
311 }
312 _ => {
313 ccx.sess.bug("non-variant or struct in variant_opt()");
314 }
315 }
316 }
317
318 pub enum TransBindingMode {
319 TrByValue(/*ismove:*/ bool, /*llbinding:*/ ValueRef),
320 TrByRef,
321 }
322
323 /**
324 * Information about a pattern binding:
325 * - `llmatch` is a pointer to a stack slot. The stack slot contains a
326 * pointer into the value being matched. Hence, llmatch has type `T**`
327 * where `T` is the value being matched.
328 * - `trmode` is the trans binding mode
329 * - `id` is the node id of the binding
330 * - `ty` is the Rust type of the binding */
331 pub struct BindingInfo {
332 llmatch: ValueRef,
333 trmode: TransBindingMode,
334 id: ast::node_id,
335 ty: ty::t,
336 }
337
338 pub type BindingsMap = HashMap<ident, BindingInfo>;
339
340 pub struct ArmData<'self> {
341 bodycx: block,
342 arm: &'self ast::arm,
343 bindings_map: BindingsMap
344 }
345
346 pub struct Match<'self> {
347 pats: ~[@ast::pat],
348 data: @ArmData<'self>
349 }
350
351 pub fn match_to_str(bcx: block, m: &Match) -> ~str {
352 if bcx.sess().verbose() {
353 // for many programs, this just take too long to serialize
354 fmt!("%?", m.pats.map(|p| pat_to_str(*p, bcx.sess().intr())))
355 } else {
356 fmt!("%u pats", m.pats.len())
357 }
358 }
359
360 pub fn matches_to_str(bcx: block, m: &[@Match]) -> ~str {
361 fmt!("%?", m.map(|n| match_to_str(bcx, *n)))
362 }
363
364 pub fn has_nested_bindings(m: &[@Match], col: uint) -> bool {
365 for m.iter().advance |br| {
366 match br.pats[col].node {
367 ast::pat_ident(_, _, Some(_)) => return true,
368 _ => ()
369 }
370 }
371 return false;
372 }
373
374 pub fn expand_nested_bindings<'r>(bcx: block,
375 m: &[@Match<'r>],
376 col: uint,
377 val: ValueRef)
378 -> ~[@Match<'r>] {
379 debug!("expand_nested_bindings(bcx=%s, m=%s, col=%u, val=%?)",
380 bcx.to_str(),
381 matches_to_str(bcx, m),
382 col,
383 bcx.val_to_str(val));
384 let _indenter = indenter();
385
386 do m.map |br| {
387 match br.pats[col].node {
388 ast::pat_ident(_, path, Some(inner)) => {
389 let pats = vec::append(
390 br.pats.slice(0u, col).to_owned(),
391 vec::append(~[inner],
392 br.pats.slice(col + 1u,
393 br.pats.len())));
394
395 let binding_info =
396 br.data.bindings_map.get(&path_to_ident(path));
397
398 Store(bcx, val, binding_info.llmatch);
399 @Match {pats: pats, data: br.data}
400 }
401 _ => {
402 *br
403 }
404 }
405 }
406 }
407
408 pub fn assert_is_binding_or_wild(bcx: block, p: @ast::pat) {
409 if !pat_is_binding_or_wild(bcx.tcx().def_map, p) {
410 bcx.sess().span_bug(
411 p.span,
412 fmt!("Expected an identifier pattern but found p: %s",
413 pat_to_str(p, bcx.sess().intr())));
414 }
415 }
416
417 pub type enter_pat<'self> = &'self fn(@ast::pat) -> Option<~[@ast::pat]>;
418
419 pub fn enter_match<'r>(bcx: block,
420 dm: DefMap,
421 m: &[@Match<'r>],
422 col: uint,
423 val: ValueRef,
424 e: enter_pat)
425 -> ~[@Match<'r>] {
426 debug!("enter_match(bcx=%s, m=%s, col=%u, val=%?)",
427 bcx.to_str(),
428 matches_to_str(bcx, m),
429 col,
430 bcx.val_to_str(val));
431 let _indenter = indenter();
432
433 let mut result = ~[];
434 for m.iter().advance |br| {
435 match e(br.pats[col]) {
436 Some(sub) => {
437 let pats =
438 vec::append(
439 vec::append(sub, br.pats.slice(0u, col)),
440 br.pats.slice(col + 1u, br.pats.len()));
441
442 let this = br.pats[col];
443 match this.node {
444 ast::pat_ident(_, path, None) => {
445 if pat_is_binding(dm, this) {
446 let binding_info =
447 br.data.bindings_map.get(
448 &path_to_ident(path));
449 Store(bcx, val, binding_info.llmatch);
450 }
451 }
452 _ => {}
453 }
454
455 result.push(@Match {pats: pats, data: br.data});
456 }
457 None => ()
458 }
459 }
460
461 debug!("result=%s", matches_to_str(bcx, result));
462
463 return result;
464 }
465
466 pub fn enter_default<'r>(bcx: block,
467 dm: DefMap,
468 m: &[@Match<'r>],
469 col: uint,
470 val: ValueRef)
471 -> ~[@Match<'r>] {
472 debug!("enter_default(bcx=%s, m=%s, col=%u, val=%?)",
473 bcx.to_str(),
474 matches_to_str(bcx, m),
475 col,
476 bcx.val_to_str(val));
477 let _indenter = indenter();
478
479 do enter_match(bcx, dm, m, col, val) |p| {
480 match p.node {
481 ast::pat_wild | ast::pat_tup(_) | ast::pat_struct(*) => Some(~[]),
482 ast::pat_ident(_, _, None) if pat_is_binding(dm, p) => Some(~[]),
483 _ => None
484 }
485 }
486 }
487
488 // <pcwalton> nmatsakis: what does enter_opt do?
489 // <pcwalton> in trans/match
490 // <pcwalton> trans/match.rs is like stumbling around in a dark cave
491 // <nmatsakis> pcwalton: the enter family of functions adjust the set of
492 // patterns as needed
493 // <nmatsakis> yeah, at some point I kind of achieved some level of
494 // understanding
495 // <nmatsakis> anyhow, they adjust the patterns given that something of that
496 // kind has been found
497 // <nmatsakis> pcwalton: ok, right, so enter_XXX() adjusts the patterns, as I
498 // said
499 // <nmatsakis> enter_match() kind of embodies the generic code
500 // <nmatsakis> it is provided with a function that tests each pattern to see
501 // if it might possibly apply and so forth
502 // <nmatsakis> so, if you have a pattern like {a: _, b: _, _} and one like _
503 // <nmatsakis> then _ would be expanded to (_, _)
504 // <nmatsakis> one spot for each of the sub-patterns
505 // <nmatsakis> enter_opt() is one of the more complex; it covers the fallible
506 // cases
507 // <nmatsakis> enter_rec_or_struct() or enter_tuple() are simpler, since they
508 // are infallible patterns
509 // <nmatsakis> so all patterns must either be records (resp. tuples) or
510 // wildcards
511
512 pub fn enter_opt<'r>(bcx: block,
513 m: &[@Match<'r>],
514 opt: &Opt,
515 col: uint,
516 variant_size: uint,
517 val: ValueRef)
518 -> ~[@Match<'r>] {
519 debug!("enter_opt(bcx=%s, m=%s, col=%u, val=%?)",
520 bcx.to_str(),
521 matches_to_str(bcx, m),
522 col,
523 bcx.val_to_str(val));
524 let _indenter = indenter();
525
526 let tcx = bcx.tcx();
527 let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
528 do enter_match(bcx, tcx.def_map, m, col, val) |p| {
529 match p.node {
530 ast::pat_enum(*) |
531 ast::pat_ident(_, _, None) if pat_is_const(tcx.def_map, p) => {
532 let const_def = tcx.def_map.get_copy(&p.id);
533 let const_def_id = ast_util::def_id_of_def(const_def);
534 if opt_eq(tcx, &lit(ConstLit(const_def_id)), opt) {
535 Some(~[])
536 } else {
537 None
538 }
539 }
540 ast::pat_enum(_, ref subpats) => {
541 if opt_eq(tcx, &variant_opt(bcx, p.id), opt) {
542 match *subpats {
543 None => Some(vec::from_elem(variant_size, dummy)),
544 _ => copy *subpats
545 }
546 } else {
547 None
548 }
549 }
550 ast::pat_ident(_, _, None)
551 if pat_is_variant_or_struct(tcx.def_map, p) => {
552 if opt_eq(tcx, &variant_opt(bcx, p.id), opt) {
553 Some(~[])
554 } else {
555 None
556 }
557 }
558 ast::pat_lit(l) => {
559 if opt_eq(tcx, &lit(ExprLit(l)), opt) {Some(~[])} else {None}
560 }
561 ast::pat_range(l1, l2) => {
562 if opt_eq(tcx, &range(l1, l2), opt) {Some(~[])} else {None}
563 }
564 ast::pat_struct(_, ref field_pats, _) => {
565 if opt_eq(tcx, &variant_opt(bcx, p.id), opt) {
566 // Look up the struct variant ID.
567 let struct_id;
568 match tcx.def_map.get_copy(&p.id) {
569 ast::def_variant(_, found_struct_id) => {
570 struct_id = found_struct_id;
571 }
572 _ => {
573 tcx.sess.span_bug(p.span, "expected enum variant def");
574 }
575 }
576
577 // Reorder the patterns into the same order they were
578 // specified in the struct definition. Also fill in
579 // unspecified fields with dummy.
580 let mut reordered_patterns = ~[];
581 let r = ty::lookup_struct_fields(tcx, struct_id);
582 for r.iter().advance |field| {
583 match field_pats.iter().find_(|p| p.ident == field.ident) {
584 None => reordered_patterns.push(dummy),
585 Some(fp) => reordered_patterns.push(fp.pat)
586 }
587 }
588 Some(reordered_patterns)
589 } else {
590 None
591 }
592 }
593 ast::pat_vec(ref before, slice, ref after) => {
594 match slice {
595 Some(slice) => {
596 let n = before.len() + after.len();
597 let i = before.len();
598 if opt_eq(tcx, &vec_len_ge(n, i), opt) {
599 Some(vec::append_one(copy *before, slice) +
600 *after)
601 } else {
602 None
603 }
604 }
605 None => {
606 let n = before.len();
607 if opt_eq(tcx, &vec_len_eq(n), opt) {
608 Some(copy *before)
609 } else {
610 None
611 }
612 }
613 }
614 }
615 _ => {
616 assert_is_binding_or_wild(bcx, p);
617 Some(vec::from_elem(variant_size, dummy))
618 }
619 }
620 }
621 }
622
623 pub fn enter_rec_or_struct<'r>(bcx: block,
624 dm: DefMap,
625 m: &[@Match<'r>],
626 col: uint,
627 fields: &[ast::ident],
628 val: ValueRef)
629 -> ~[@Match<'r>] {
630 debug!("enter_rec_or_struct(bcx=%s, m=%s, col=%u, val=%?)",
631 bcx.to_str(),
632 matches_to_str(bcx, m),
633 col,
634 bcx.val_to_str(val));
635 let _indenter = indenter();
636
637 let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
638 do enter_match(bcx, dm, m, col, val) |p| {
639 match p.node {
640 ast::pat_struct(_, ref fpats, _) => {
641 let mut pats = ~[];
642 for fields.iter().advance |fname| {
643 match fpats.iter().find_(|p| p.ident == *fname) {
644 None => pats.push(dummy),
645 Some(pat) => pats.push(pat.pat)
646 }
647 }
648 Some(pats)
649 }
650 _ => {
651 assert_is_binding_or_wild(bcx, p);
652 Some(vec::from_elem(fields.len(), dummy))
653 }
654 }
655 }
656 }
657
658 pub fn enter_tup<'r>(bcx: block,
659 dm: DefMap,
660 m: &[@Match<'r>],
661 col: uint,
662 val: ValueRef,
663 n_elts: uint)
664 -> ~[@Match<'r>] {
665 debug!("enter_tup(bcx=%s, m=%s, col=%u, val=%?)",
666 bcx.to_str(),
667 matches_to_str(bcx, m),
668 col,
669 bcx.val_to_str(val));
670 let _indenter = indenter();
671
672 let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
673 do enter_match(bcx, dm, m, col, val) |p| {
674 match p.node {
675 ast::pat_tup(ref elts) => {
676 Some(copy *elts)
677 }
678 _ => {
679 assert_is_binding_or_wild(bcx, p);
680 Some(vec::from_elem(n_elts, dummy))
681 }
682 }
683 }
684 }
685
686 pub fn enter_tuple_struct<'r>(bcx: block,
687 dm: DefMap,
688 m: &[@Match<'r>],
689 col: uint,
690 val: ValueRef,
691 n_elts: uint)
692 -> ~[@Match<'r>] {
693 debug!("enter_tuple_struct(bcx=%s, m=%s, col=%u, val=%?)",
694 bcx.to_str(),
695 matches_to_str(bcx, m),
696 col,
697 bcx.val_to_str(val));
698 let _indenter = indenter();
699
700 let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
701 do enter_match(bcx, dm, m, col, val) |p| {
702 match p.node {
703 ast::pat_enum(_, Some(ref elts)) => Some(copy *elts),
704 _ => {
705 assert_is_binding_or_wild(bcx, p);
706 Some(vec::from_elem(n_elts, dummy))
707 }
708 }
709 }
710 }
711
712 pub fn enter_box<'r>(bcx: block,
713 dm: DefMap,
714 m: &[@Match<'r>],
715 col: uint,
716 val: ValueRef)
717 -> ~[@Match<'r>] {
718 debug!("enter_box(bcx=%s, m=%s, col=%u, val=%?)",
719 bcx.to_str(),
720 matches_to_str(bcx, m),
721 col,
722 bcx.val_to_str(val));
723 let _indenter = indenter();
724
725 let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
726 do enter_match(bcx, dm, m, col, val) |p| {
727 match p.node {
728 ast::pat_box(sub) => {
729 Some(~[sub])
730 }
731 _ => {
732 assert_is_binding_or_wild(bcx, p);
733 Some(~[dummy])
734 }
735 }
736 }
737 }
738
739 pub fn enter_uniq<'r>(bcx: block,
740 dm: DefMap,
741 m: &[@Match<'r>],
742 col: uint,
743 val: ValueRef)
744 -> ~[@Match<'r>] {
745 debug!("enter_uniq(bcx=%s, m=%s, col=%u, val=%?)",
746 bcx.to_str(),
747 matches_to_str(bcx, m),
748 col,
749 bcx.val_to_str(val));
750 let _indenter = indenter();
751
752 let dummy = @ast::pat {id: 0, node: ast::pat_wild, span: dummy_sp()};
753 do enter_match(bcx, dm, m, col, val) |p| {
754 match p.node {
755 ast::pat_uniq(sub) => {
756 Some(~[sub])
757 }
758 _ => {
759 assert_is_binding_or_wild(bcx, p);
760 Some(~[dummy])
761 }
762 }
763 }
764 }
765
766 pub fn enter_region<'r>(bcx: block,
767 dm: DefMap,
768 m: &[@Match<'r>],
769 col: uint,
770 val: ValueRef)
771 -> ~[@Match<'r>] {
772 debug!("enter_region(bcx=%s, m=%s, col=%u, val=%?)",
773 bcx.to_str(),
774 matches_to_str(bcx, m),
775 col,
776 bcx.val_to_str(val));
777 let _indenter = indenter();
778
779 let dummy = @ast::pat { id: 0, node: ast::pat_wild, span: dummy_sp() };
780 do enter_match(bcx, dm, m, col, val) |p| {
781 match p.node {
782 ast::pat_region(sub) => {
783 Some(~[sub])
784 }
785 _ => {
786 assert_is_binding_or_wild(bcx, p);
787 Some(~[dummy])
788 }
789 }
790 }
791 }
792
793 // Returns the options in one column of matches. An option is something that
794 // needs to be conditionally matched at runtime; for example, the discriminant
795 // on a set of enum variants or a literal.
796 pub fn get_options(bcx: block, m: &[@Match], col: uint) -> ~[Opt] {
797 let ccx = bcx.ccx();
798 fn add_to_set(tcx: ty::ctxt, set: &mut ~[Opt], val: Opt) {
799 if set.iter().any_(|l| opt_eq(tcx, l, &val)) {return;}
800 set.push(val);
801 }
802
803 let mut found = ~[];
804 for m.iter().advance |br| {
805 let cur = br.pats[col];
806 match cur.node {
807 ast::pat_lit(l) => {
808 add_to_set(ccx.tcx, &mut found, lit(ExprLit(l)));
809 }
810 ast::pat_ident(*) => {
811 // This is one of: an enum variant, a unit-like struct, or a
812 // variable binding.
813 match ccx.tcx.def_map.find(&cur.id) {
814 Some(&ast::def_variant(*)) => {
815 add_to_set(ccx.tcx, &mut found,
816 variant_opt(bcx, cur.id));
817 }
818 Some(&ast::def_struct(*)) => {
819 add_to_set(ccx.tcx, &mut found,
820 lit(UnitLikeStructLit(cur.id)));
821 }
822 Some(&ast::def_static(const_did, false)) => {
823 add_to_set(ccx.tcx, &mut found,
824 lit(ConstLit(const_did)));
825 }
826 _ => {}
827 }
828 }
829 ast::pat_enum(*) | ast::pat_struct(*) => {
830 // This could be one of: a tuple-like enum variant, a
831 // struct-like enum variant, or a struct.
832 match ccx.tcx.def_map.find(&cur.id) {
833 Some(&ast::def_fn(*)) |
834 Some(&ast::def_variant(*)) => {
835 add_to_set(ccx.tcx, &mut found,
836 variant_opt(bcx, cur.id));
837 }
838 Some(&ast::def_static(const_did, false)) => {
839 add_to_set(ccx.tcx, &mut found,
840 lit(ConstLit(const_did)));
841 }
842 _ => {}
843 }
844 }
845 ast::pat_range(l1, l2) => {
846 add_to_set(ccx.tcx, &mut found, range(l1, l2));
847 }
848 ast::pat_vec(ref before, slice, ref after) => {
849 let opt = match slice {
850 None => vec_len_eq(before.len()),
851 Some(_) => vec_len_ge(before.len() + after.len(),
852 before.len())
853 };
854 add_to_set(ccx.tcx, &mut found, opt);
855 }
856 _ => {}
857 }
858 }
859 return found;
860 }
861
862 pub struct ExtractedBlock {
863 vals: ~[ValueRef],
864 bcx: block
865 }
866
867 pub fn extract_variant_args(bcx: block,
868 repr: &adt::Repr,
869 disr_val: int,
870 val: ValueRef)
871 -> ExtractedBlock {
872 let _icx = push_ctxt("match::extract_variant_args");
873 let args = do vec::from_fn(adt::num_args(repr, disr_val)) |i| {
874 adt::trans_field_ptr(bcx, repr, val, disr_val, i)
875 };
876
877 ExtractedBlock { vals: args, bcx: bcx }
878 }
879
880 fn match_datum(bcx: block, val: ValueRef, pat_id: ast::node_id) -> Datum {
881 //! Helper for converting from the ValueRef that we pass around in
882 //! the match code, which is always by ref, into a Datum. Eventually
883 //! we should just pass around a Datum and be done with it.
884
885 let ty = node_id_type(bcx, pat_id);
886 Datum {val: val, ty: ty, mode: datum::ByRef(RevokeClean)}
887 }
888
889
890 pub fn extract_vec_elems(bcx: block,
891 pat_span: span,
892 pat_id: ast::node_id,
893 elem_count: uint,
894 slice: Option<uint>,
895 val: ValueRef,
896 count: ValueRef)
897 -> ExtractedBlock {
898 let _icx = push_ctxt("match::extract_vec_elems");
899 let vec_datum = match_datum(bcx, val, pat_id);
900 let (bcx, base, len) = vec_datum.get_vec_base_and_len(bcx, pat_span,
901 pat_id, 0);
902 let vt = tvec::vec_types(bcx, node_id_type(bcx, pat_id));
903
904 let mut elems = do vec::from_fn(elem_count) |i| {
905 match slice {
906 None => GEPi(bcx, base, [i]),
907 Some(n) if i < n => GEPi(bcx, base, [i]),
908 Some(n) if i > n => {
909 InBoundsGEP(bcx, base, [
910 Sub(bcx, count,
911 C_int(bcx.ccx(), (elem_count - i) as int))])
912 }
913 _ => unsafe { llvm::LLVMGetUndef(vt.llunit_ty.to_ref()) }
914 }
915 };
916 if slice.is_some() {
917 let n = slice.get();
918 let slice_offset = Mul(bcx, vt.llunit_size,
919 C_int(bcx.ccx(), n as int)
920 );
921 let slice_begin = tvec::pointer_add(bcx, base, slice_offset);
922 let slice_len_offset = Mul(bcx, vt.llunit_size,
923 C_int(bcx.ccx(), (elem_count - 1u) as int)
924 );
925 let slice_len = Sub(bcx, len, slice_len_offset);
926 let slice_ty = ty::mk_evec(bcx.tcx(),
927 ty::mt {ty: vt.unit_ty, mutbl: ast::m_imm},
928 ty::vstore_slice(ty::re_static)
929 );
930 let scratch = scratch_datum(bcx, slice_ty, false);
931 Store(bcx, slice_begin,
932 GEPi(bcx, scratch.val, [0u, abi::slice_elt_base])
933 );
934 Store(bcx, slice_len,
935 GEPi(bcx, scratch.val, [0u, abi::slice_elt_len])
936 );
937 elems[n] = scratch.val;
938 scratch.add_clean(bcx);
939 }
940
941 ExtractedBlock { vals: elems, bcx: bcx }
942 }
943
944 // NB: This function does not collect fields from struct-like enum variants.
945 pub fn collect_record_or_struct_fields(bcx: block,
946 m: &[@Match],
947 col: uint)
948 -> ~[ast::ident] {
949 let mut fields: ~[ast::ident] = ~[];
950 for m.iter().advance |br| {
951 match br.pats[col].node {
952 ast::pat_struct(_, ref fs, _) => {
953 match ty::get(node_id_type(bcx, br.pats[col].id)).sty {
954 ty::ty_struct(*) => extend(&mut fields, *fs),
955 _ => ()
956 }
957 }
958 _ => ()
959 }
960 }
961 return fields;
962
963 fn extend(idents: &mut ~[ast::ident], field_pats: &[ast::field_pat]) {
964 for field_pats.iter().advance |field_pat| {
965 let field_ident = field_pat.ident;
966 if !idents.iter().any_(|x| *x == field_ident) {
967 idents.push(field_ident);
968 }
969 }
970 }
971 }
972
973 pub fn pats_require_rooting(bcx: block,
974 m: &[@Match],
975 col: uint)
976 -> bool {
977 do m.iter().any_ |br| {
978 let pat_id = br.pats[col].id;
979 let key = root_map_key {id: pat_id, derefs: 0u };
980 bcx.ccx().maps.root_map.contains_key(&key)
981 }
982 }
983
984 pub fn root_pats_as_necessary(mut bcx: block,
985 m: &[@Match],
986 col: uint,
987 val: ValueRef)
988 -> block {
989 for m.iter().advance |br| {
990 let pat_id = br.pats[col].id;
991 if pat_id != 0 {
992 let datum = Datum {val: val, ty: node_id_type(bcx, pat_id),
993 mode: ByRef(ZeroMem)};
994 bcx = datum.root_and_write_guard(bcx, br.pats[col].span, pat_id, 0);
995 }
996 }
997 return bcx;
998 }
999
1000 // Macro for deciding whether any of the remaining matches fit a given kind of
1001 // pattern. Note that, because the macro is well-typed, either ALL of the
1002 // matches should fit that sort of pattern or NONE (however, some of the
1003 // matches may be wildcards like _ or identifiers).
1004 macro_rules! any_pat (
1005 ($m:expr, $pattern:pat) => (
1006 do ($m).iter().any_ |br| {
1007 match br.pats[col].node {
1008 $pattern => true,
1009 _ => false
1010 }
1011 }
1012 )
1013 )
1014
1015 pub fn any_box_pat(m: &[@Match], col: uint) -> bool {
1016 any_pat!(m, ast::pat_box(_))
1017 }
1018
1019 pub fn any_uniq_pat(m: &[@Match], col: uint) -> bool {
1020 any_pat!(m, ast::pat_uniq(_))
1021 }
1022
1023 pub fn any_region_pat(m: &[@Match], col: uint) -> bool {
1024 any_pat!(m, ast::pat_region(_))
1025 }
1026
1027 pub fn any_tup_pat(m: &[@Match], col: uint) -> bool {
1028 any_pat!(m, ast::pat_tup(_))
1029 }
1030
1031 pub fn any_tuple_struct_pat(bcx: block, m: &[@Match], col: uint) -> bool {
1032 do m.iter().any_ |br| {
1033 let pat = br.pats[col];
1034 match pat.node {
1035 ast::pat_enum(_, Some(_)) => {
1036 match bcx.tcx().def_map.find(&pat.id) {
1037 Some(&ast::def_fn(*)) |
1038 Some(&ast::def_struct(*)) => true,
1039 _ => false
1040 }
1041 }
1042 _ => false
1043 }
1044 }
1045 }
1046
1047 pub type mk_fail = @fn() -> BasicBlockRef;
1048
1049 pub fn pick_col(m: &[@Match]) -> uint {
1050 fn score(p: &ast::pat) -> uint {
1051 match p.node {
1052 ast::pat_lit(_) | ast::pat_enum(_, _) | ast::pat_range(_, _) => 1u,
1053 ast::pat_ident(_, _, Some(p)) => score(p),
1054 _ => 0u
1055 }
1056 }
1057 let mut scores = vec::from_elem(m[0].pats.len(), 0u);
1058 for m.iter().advance |br| {
1059 let mut i = 0u;
1060 for br.pats.iter().advance |p| { scores[i] += score(*p); i += 1u; }
1061 }
1062 let mut max_score = 0u;
1063 let mut best_col = 0u;
1064 let mut i = 0u;
1065 for scores.iter().advance |score| {
1066 let score = *score;
1067
1068 // Irrefutable columns always go first, they'd only be duplicated in
1069 // the branches.
1070 if score == 0u { return i; }
1071 // If no irrefutable ones are found, we pick the one with the biggest
1072 // branching factor.
1073 if score > max_score { max_score = score; best_col = i; }
1074 i += 1u;
1075 }
1076 return best_col;
1077 }
1078
1079 #[deriving(Eq)]
1080 pub enum branch_kind { no_branch, single, switch, compare, compare_vec_len, }
1081
1082 // Compiles a comparison between two things.
1083 //
1084 // NB: This must produce an i1, not a Rust bool (i8).
1085 pub fn compare_values(cx: block,
1086 lhs: ValueRef,
1087 rhs: ValueRef,
1088 rhs_t: ty::t)
1089 -> Result {
1090 let _icx = push_ctxt("compare_values");
1091 if ty::type_is_scalar(rhs_t) {
1092 let rs = compare_scalar_types(cx, lhs, rhs, rhs_t, ast::eq);
1093 return rslt(rs.bcx, rs.val);
1094 }
1095
1096 match ty::get(rhs_t).sty {
1097 ty::ty_estr(ty::vstore_uniq) => {
1098 let scratch_result = scratch_datum(cx, ty::mk_bool(), false);
1099 let scratch_lhs = alloca(cx, val_ty(lhs));
1100 Store(cx, lhs, scratch_lhs);
1101 let scratch_rhs = alloca(cx, val_ty(rhs));
1102 Store(cx, rhs, scratch_rhs);
1103 let did = cx.tcx().lang_items.uniq_str_eq_fn();
1104 let bcx = callee::trans_lang_call(cx, did, [scratch_lhs, scratch_rhs],
1105 expr::SaveIn(scratch_result.val));
1106 let result = scratch_result.to_result(bcx);
1107 Result {
1108 bcx: result.bcx,
1109 val: bool_to_i1(result.bcx, result.val)
1110 }
1111 }
1112 ty::ty_estr(_) => {
1113 let scratch_result = scratch_datum(cx, ty::mk_bool(), false);
1114 let did = cx.tcx().lang_items.str_eq_fn();
1115 let bcx = callee::trans_lang_call(cx, did, [lhs, rhs],
1116 expr::SaveIn(scratch_result.val));
1117 let result = scratch_result.to_result(bcx);
1118 Result {
1119 bcx: result.bcx,
1120 val: bool_to_i1(result.bcx, result.val)
1121 }
1122 }
1123 _ => {
1124 cx.tcx().sess.bug("only scalars and strings supported in \
1125 compare_values");
1126 }
1127 }
1128 }
1129
1130 fn store_non_ref_bindings(bcx: block,
1131 bindings_map: &BindingsMap,
1132 mut opt_temp_cleanups: Option<&mut ~[ValueRef]>)
1133 -> block
1134 {
1135 /*!
1136 *
1137 * For each copy/move binding, copy the value from the value
1138 * being matched into its final home. This code executes once
1139 * one of the patterns for a given arm has completely matched.
1140 * It adds temporary cleanups to the `temp_cleanups` array,
1141 * if one is provided.
1142 */
1143
1144 let mut bcx = bcx;
1145 for bindings_map.each_value |&binding_info| {
1146 match binding_info.trmode {
1147 TrByValue(is_move, lldest) => {
1148 let llval = Load(bcx, binding_info.llmatch); // get a T*
1149 let datum = Datum {val: llval, ty: binding_info.ty,
1150 mode: ByRef(ZeroMem)};
1151 bcx = {
1152 if is_move {
1153 datum.move_to(bcx, INIT, lldest)
1154 } else {
1155 datum.copy_to(bcx, INIT, lldest)
1156 }
1157 };
1158
1159 do opt_temp_cleanups.mutate |temp_cleanups| {
1160 add_clean_temp_mem(bcx, lldest, binding_info.ty);
1161 temp_cleanups.push(lldest);
1162 temp_cleanups
1163 }
1164 }
1165 TrByRef => {}
1166 }
1167 }
1168 return bcx;
1169 }
1170
1171 fn insert_lllocals(bcx: block,
1172 bindings_map: &BindingsMap,
1173 binding_mode: IrrefutablePatternBindingMode,
1174 add_cleans: bool) -> block {
1175 /*!
1176 * For each binding in `data.bindings_map`, adds an appropriate entry into
1177 * the `fcx.lllocals` map. If add_cleans is true, then adds cleanups for
1178 * the bindings.
1179 */
1180
1181 let llmap = match binding_mode {
1182 BindLocal => bcx.fcx.lllocals,
1183 BindArgument => bcx.fcx.llargs
1184 };
1185
1186 for bindings_map.each_value |&binding_info| {
1187 let llval = match binding_info.trmode {
1188 // By value bindings: use the stack slot that we
1189 // copied/moved the value into
1190 TrByValue(_, lldest) => {
1191 if add_cleans {
1192 add_clean(bcx, lldest, binding_info.ty);
1193 }
1194
1195 lldest
1196 }
1197
1198 // By ref binding: use the ptr into the matched value
1199 TrByRef => {
1200 binding_info.llmatch
1201 }
1202 };
1203
1204 debug!("binding %? to %s", binding_info.id, bcx.val_to_str(llval));
1205 llmap.insert(binding_info.id, llval);
1206 }
1207 return bcx;
1208 }
1209
1210 pub fn compile_guard(bcx: block,
1211 guard_expr: @ast::expr,
1212 data: &ArmData,
1213 m: &[@Match],
1214 vals: &[ValueRef],
1215 chk: Option<mk_fail>)
1216 -> block {
1217 debug!("compile_guard(bcx=%s, guard_expr=%s, m=%s, vals=%?)",
1218 bcx.to_str(),
1219 bcx.expr_to_str(guard_expr),
1220 matches_to_str(bcx, m),
1221 vals.map(|v| bcx.val_to_str(*v)));
1222 let _indenter = indenter();
1223
1224 let mut bcx = bcx;
1225 let mut temp_cleanups = ~[];
1226 bcx = store_non_ref_bindings(bcx, &data.bindings_map, Some(&mut temp_cleanups));
1227 bcx = insert_lllocals(bcx, &data.bindings_map, BindLocal, false);
1228
1229 let val = unpack_result!(bcx, {
1230 do with_scope_result(bcx, guard_expr.info(),
1231 "guard") |bcx| {
1232 expr::trans_to_datum(bcx, guard_expr).to_result()
1233 }
1234 });
1235 let val = bool_to_i1(bcx, val);
1236
1237 // Revoke the temp cleanups now that the guard successfully executed.
1238 for temp_cleanups.iter().advance |llval| {
1239 revoke_clean(bcx, *llval);
1240 }
1241
1242 return do with_cond(bcx, Not(bcx, val)) |bcx| {
1243 // Guard does not match: free the values we copied,
1244 // and remove all bindings from the lllocals table
1245 let bcx = drop_bindings(bcx, data);
1246 compile_submatch(bcx, m, vals, chk);
1247 bcx
1248 };
1249
1250 fn drop_bindings(bcx: block, data: &ArmData) -> block {
1251 let mut bcx = bcx;
1252 for data.bindings_map.each_value |&binding_info| {
1253 match binding_info.trmode {
1254 TrByValue(_, llval) => {
1255 bcx = glue::drop_ty(bcx, llval, binding_info.ty);
1256 }
1257 TrByRef => {}
1258 }
1259 bcx.fcx.lllocals.remove(&binding_info.id);
1260 }
1261 return bcx;
1262 }
1263 }
1264
1265 pub fn compile_submatch(bcx: block,
1266 m: &[@Match],
1267 vals: &[ValueRef],
1268 chk: Option<mk_fail>) {
1269 debug!("compile_submatch(bcx=%s, m=%s, vals=%?)",
1270 bcx.to_str(),
1271 matches_to_str(bcx, m),
1272 vals.map(|v| bcx.val_to_str(*v)));
1273 let _indenter = indenter();
1274
1275 /*
1276 For an empty match, a fall-through case must exist
1277 */
1278 assert!((m.len() > 0u || chk.is_some()));
1279 let _icx = push_ctxt("match::compile_submatch");
1280 let mut bcx = bcx;
1281 let tcx = bcx.tcx();
1282 let dm = tcx.def_map;
1283 if m.len() == 0u {
1284 Br(bcx, chk.get()());
1285 return;
1286 }
1287 if m[0].pats.len() == 0u {
1288 let data = m[0].data;
1289 match data.arm.guard {
1290 Some(guard_expr) => {
1291 bcx = compile_guard(bcx, guard_expr, m[0].data,
1292 m.slice(1, m.len()),
1293 vals, chk);
1294 }
1295 _ => ()
1296 }
1297 Br(bcx, data.bodycx.llbb);
1298 return;
1299 }
1300
1301 let col = pick_col(m);
1302 let val = vals[col];
1303 let m = {
1304 if has_nested_bindings(m, col) {
1305 expand_nested_bindings(bcx, m, col, val)
1306 } else {
1307 m.to_owned()
1308 }
1309 };
1310
1311 let vals_left = vec::append(vals.slice(0u, col).to_owned(),
1312 vals.slice(col + 1u, vals.len()));
1313 let ccx = bcx.fcx.ccx;
1314 let mut pat_id = 0;
1315 let mut pat_span = dummy_sp();
1316 for m.iter().advance |br| {
1317 // Find a real id (we're adding placeholder wildcard patterns, but
1318 // each column is guaranteed to have at least one real pattern)
1319 if pat_id == 0 {
1320 pat_id = br.pats[col].id;
1321 pat_span = br.pats[col].span;
1322 }
1323 }
1324
1325 // If we are not matching against an `@T`, we should not be
1326 // required to root any values.
1327 assert!(any_box_pat(m, col) || !pats_require_rooting(bcx, m, col));
1328
1329 let rec_fields = collect_record_or_struct_fields(bcx, m, col);
1330 if rec_fields.len() > 0 {
1331 let pat_ty = node_id_type(bcx, pat_id);
1332 let pat_repr = adt::represent_type(bcx.ccx(), pat_ty);
1333 do expr::with_field_tys(tcx, pat_ty, None) |discr, field_tys| {
1334 let rec_vals = rec_fields.map(|field_name| {
1335 let ix = ty::field_idx_strict(tcx, *field_name, field_tys);
1336 adt::trans_field_ptr(bcx, pat_repr, val, discr, ix)
1337 });
1338 compile_submatch(
1339 bcx,
1340 enter_rec_or_struct(bcx, dm, m, col, rec_fields, val),
1341 vec::append(rec_vals, vals_left),
1342 chk);
1343 }
1344 return;
1345 }
1346
1347 if any_tup_pat(m, col) {
1348 let tup_ty = node_id_type(bcx, pat_id);
1349 let tup_repr = adt::represent_type(bcx.ccx(), tup_ty);
1350 let n_tup_elts = match ty::get(tup_ty).sty {
1351 ty::ty_tup(ref elts) => elts.len(),
1352 _ => ccx.sess.bug("non-tuple type in tuple pattern")
1353 };
1354 let tup_vals = do vec::from_fn(n_tup_elts) |i| {
1355 adt::trans_field_ptr(bcx, tup_repr, val, 0, i)
1356 };
1357 compile_submatch(bcx, enter_tup(bcx, dm, m, col, val, n_tup_elts),
1358 vec::append(tup_vals, vals_left), chk);
1359 return;
1360 }
1361
1362 if any_tuple_struct_pat(bcx, m, col) {
1363 let struct_ty = node_id_type(bcx, pat_id);
1364 let struct_element_count;
1365 match ty::get(struct_ty).sty {
1366 ty::ty_struct(struct_id, _) => {
1367 struct_element_count =
1368 ty::lookup_struct_fields(tcx, struct_id).len();
1369 }
1370 _ => {
1371 ccx.sess.bug("non-struct type in tuple struct pattern");
1372 }
1373 }
1374
1375 let struct_repr = adt::represent_type(bcx.ccx(), struct_ty);
1376 let llstructvals = do vec::from_fn(struct_element_count) |i| {
1377 adt::trans_field_ptr(bcx, struct_repr, val, 0, i)
1378 };
1379 compile_submatch(bcx,
1380 enter_tuple_struct(bcx, dm, m, col, val,
1381 struct_element_count),
1382 vec::append(llstructvals, vals_left),
1383 chk);
1384 return;
1385 }
1386
1387 // Unbox in case of a box field
1388 if any_box_pat(m, col) {
1389 bcx = root_pats_as_necessary(bcx, m, col, val);
1390 let llbox = Load(bcx, val);
1391 let unboxed = GEPi(bcx, llbox, [0u, abi::box_field_body]);
1392 compile_submatch(bcx, enter_box(bcx, dm, m, col, val),
1393 vec::append(~[unboxed], vals_left), chk);
1394 return;
1395 }
1396
1397 if any_uniq_pat(m, col) {
1398 let llbox = Load(bcx, val);
1399 let unboxed = GEPi(bcx, llbox, [0u, abi::box_field_body]);
1400 compile_submatch(bcx, enter_uniq(bcx, dm, m, col, val),
1401 vec::append(~[unboxed], vals_left), chk);
1402 return;
1403 }
1404
1405 if any_region_pat(m, col) {
1406 let loaded_val = Load(bcx, val);
1407 compile_submatch(bcx, enter_region(bcx, dm, m, col, val),
1408 vec::append(~[loaded_val], vals_left), chk);
1409 return;
1410 }
1411
1412 // Decide what kind of branch we need
1413 let opts = get_options(bcx, m, col);
1414 let mut kind = no_branch;
1415 let mut test_val = val;
1416 if opts.len() > 0u {
1417 match opts[0] {
1418 var(_, repr) => {
1419 let (the_kind, val_opt) = adt::trans_switch(bcx, repr, val);
1420 kind = the_kind;
1421 for val_opt.iter().advance |&tval| { test_val = tval; }
1422 }
1423 lit(_) => {
1424 let pty = node_id_type(bcx, pat_id);
1425 test_val = load_if_immediate(bcx, val, pty);
1426 kind = if ty::type_is_integral(pty) { switch }
1427 else { compare };
1428 }
1429 range(_, _) => {
1430 test_val = Load(bcx, val);
1431 kind = compare;
1432 },
1433 vec_len_eq(*) | vec_len_ge(*) => {
1434 let vt = tvec::vec_types(bcx, node_id_type(bcx, pat_id));
1435 let unboxed = load_if_immediate(bcx, val, vt.vec_ty);
1436 let (_, len) = tvec::get_base_and_len(
1437 bcx, unboxed, vt.vec_ty
1438 );
1439 test_val = SDiv(bcx, len, vt.llunit_size);
1440 kind = compare_vec_len;
1441 }
1442 }
1443 }
1444 for opts.iter().advance |o| {
1445 match *o {
1446 range(_, _) => { kind = compare; break }
1447 _ => ()
1448 }
1449 }
1450 let else_cx = match kind {
1451 no_branch | single => bcx,
1452 _ => sub_block(bcx, "match_else")
1453 };
1454 let sw = if kind == switch {
1455 Switch(bcx, test_val, else_cx.llbb, opts.len())
1456 } else {
1457 C_int(ccx, 0) // Placeholder for when not using a switch
1458 };
1459
1460 let defaults = enter_default(else_cx, dm, m, col, val);
1461 let exhaustive = chk.is_none() && defaults.len() == 0u;
1462 let len = opts.len();
1463 let mut i = 0u;
1464
1465 // Compile subtrees for each option
1466 for opts.iter().advance |opt| {
1467 i += 1u;
1468 let mut opt_cx = else_cx;
1469 if !exhaustive || i < len {
1470 opt_cx = sub_block(bcx, "match_case");
1471 match kind {
1472 single => Br(bcx, opt_cx.llbb),
1473 switch => {
1474 match trans_opt(bcx, opt) {
1475 single_result(r) => {
1476 unsafe {
1477 llvm::LLVMAddCase(sw, r.val, opt_cx.llbb);
1478 bcx = r.bcx;
1479 }
1480 }
1481 _ => {
1482 bcx.sess().bug(
1483 "in compile_submatch, expected \
1484 trans_opt to return a single_result")
1485 }
1486 }
1487 }
1488 compare => {
1489 let t = node_id_type(bcx, pat_id);
1490 let Result {bcx: after_cx, val: matches} = {
1491 do with_scope_result(bcx, None,
1492 "compare_scope") |bcx| {
1493 match trans_opt(bcx, opt) {
1494 single_result(
1495 Result {bcx, val}) => {
1496 compare_values(bcx, test_val, val, t)
1497 }
1498 lower_bound(
1499 Result {bcx, val}) => {
1500 compare_scalar_types(
1501 bcx, test_val, val,
1502 t, ast::ge)
1503 }
1504 range_result(
1505 Result {val: vbegin, _},
1506 Result {bcx, val: vend}) => {
1507 let Result {bcx, val: llge} =
1508 compare_scalar_types(
1509 bcx, test_val,
1510 vbegin, t, ast::ge);
1511 let Result {bcx, val: llle} =
1512 compare_scalar_types(
1513 bcx, test_val, vend,
1514 t, ast::le);
1515 rslt(bcx, And(bcx, llge, llle))
1516 }
1517 }
1518 }
1519 };
1520 bcx = sub_block(after_cx, "compare_next");
1521 CondBr(after_cx, matches, opt_cx.llbb, bcx.llbb);
1522 }
1523 compare_vec_len => {
1524 let Result {bcx: after_cx, val: matches} = {
1525 do with_scope_result(bcx, None,
1526 "compare_vec_len_scope") |bcx| {
1527 match trans_opt(bcx, opt) {
1528 single_result(
1529 Result {bcx, val}) => {
1530 let value = compare_scalar_values(
1531 bcx, test_val, val,
1532 signed_int, ast::eq);
1533 rslt(bcx, value)
1534 }
1535 lower_bound(
1536 Result {bcx, val: val}) => {
1537 let value = compare_scalar_values(
1538 bcx, test_val, val,
1539 signed_int, ast::ge);
1540 rslt(bcx, value)
1541 }
1542 range_result(
1543 Result {val: vbegin, _},
1544 Result {bcx, val: vend}) => {
1545 let llge =
1546 compare_scalar_values(
1547 bcx, test_val,
1548 vbegin, signed_int, ast::ge);
1549 let llle =
1550 compare_scalar_values(
1551 bcx, test_val, vend,
1552 signed_int, ast::le);
1553 rslt(bcx, And(bcx, llge, llle))
1554 }
1555 }
1556 }
1557 };
1558 bcx = sub_block(after_cx, "compare_vec_len_next");
1559 CondBr(after_cx, matches, opt_cx.llbb, bcx.llbb);
1560 }
1561 _ => ()
1562 }
1563 } else if kind == compare || kind == compare_vec_len {
1564 Br(bcx, else_cx.llbb);
1565 }
1566
1567 let mut size = 0u;
1568 let mut unpacked = ~[];
1569 match *opt {
1570 var(disr_val, repr) => {
1571 let ExtractedBlock {vals: argvals, bcx: new_bcx} =
1572 extract_variant_args(opt_cx, repr, disr_val, val);
1573 size = argvals.len();
1574 unpacked = argvals;
1575 opt_cx = new_bcx;
1576 }
1577 vec_len_eq(n) | vec_len_ge(n, _) => {
1578 let n = match *opt {
1579 vec_len_ge(*) => n + 1u,
1580 _ => n
1581 };
1582 let slice = match *opt {
1583 vec_len_ge(_, i) => Some(i),
1584 _ => None
1585 };
1586 let args = extract_vec_elems(opt_cx, pat_span, pat_id, n, slice,
1587 val, test_val);
1588 size = args.vals.len();
1589 unpacked = /*bad*/copy args.vals;
1590 opt_cx = args.bcx;
1591 }
1592 lit(_) | range(_, _) => ()
1593 }
1594 let opt_ms = enter_opt(opt_cx, m, opt, col, size, val);
1595 let opt_vals = vec::append(unpacked, vals_left);
1596 compile_submatch(opt_cx, opt_ms, opt_vals, chk);
1597 }
1598
1599 // Compile the fall-through case, if any
1600 if !exhaustive {
1601 if kind == compare || kind == compare_vec_len {
1602 Br(bcx, else_cx.llbb);
1603 }
1604 if kind != single {
1605 compile_submatch(else_cx, defaults, vals_left, chk);
1606 }
1607 }
1608 }
1609
1610 pub fn trans_match(bcx: block,
1611 match_expr: &ast::expr,
1612 discr_expr: @ast::expr,
1613 arms: ~[ast::arm],
1614 dest: Dest) -> block {
1615 let _icx = push_ctxt("match::trans_match");
1616 do with_scope(bcx, match_expr.info(), "match") |bcx| {
1617 trans_match_inner(bcx, discr_expr, arms, dest)
1618 }
1619 }
1620
1621 fn create_bindings_map(bcx: block, pat: @ast::pat) -> BindingsMap {
1622 // Create the bindings map, which is a mapping from each binding name
1623 // to an alloca() that will be the value for that local variable.
1624 // Note that we use the names because each binding will have many ids
1625 // from the various alternatives.
1626 let ccx = bcx.ccx();
1627 let tcx = bcx.tcx();
1628 let mut bindings_map = HashMap::new();
1629 do pat_bindings(tcx.def_map, pat) |bm, p_id, _s, path| {
1630 let ident = path_to_ident(path);
1631 let variable_ty = node_id_type(bcx, p_id);
1632 let llvariable_ty = type_of::type_of(ccx, variable_ty);
1633
1634 let llmatch;
1635 let trmode;
1636 match bm {
1637 ast::bind_infer => {
1638 // in this case, the final type of the variable will be T,
1639 // but during matching we need to store a *T as explained
1640 // above
1641 let is_move = ccx.maps.moves_map.contains(&p_id);
1642 llmatch = alloca(bcx, llvariable_ty.ptr_to());
1643 trmode = TrByValue(is_move, alloca(bcx, llvariable_ty));
1644 }
1645 ast::bind_by_ref(_) => {
1646 llmatch = alloca(bcx, llvariable_ty);
1647 trmode = TrByRef;
1648 }
1649 };
1650 bindings_map.insert(ident, BindingInfo {
1651 llmatch: llmatch, trmode: trmode,
1652 id: p_id, ty: variable_ty
1653 });
1654 }
1655 return bindings_map;
1656 }
1657
1658 pub fn trans_match_inner(scope_cx: block,
1659 discr_expr: @ast::expr,
1660 arms: &[ast::arm],
1661 dest: Dest) -> block {
1662 let _icx = push_ctxt("match::trans_match_inner");
1663 let mut bcx = scope_cx;
1664 let tcx = bcx.tcx();
1665
1666 let discr_datum = unpack_datum!(bcx, {
1667 expr::trans_to_datum(bcx, discr_expr)
1668 });
1669 if bcx.unreachable {
1670 return bcx;
1671 }
1672
1673 let mut arm_datas = ~[];
1674 let mut matches = ~[];
1675 for arms.iter().advance |arm| {
1676 let body = scope_block(bcx, arm.body.info(), "case_body");
1677 let bindings_map = create_bindings_map(bcx, arm.pats[0]);
1678 let arm_data = @ArmData {bodycx: body,
1679 arm: arm,
1680 bindings_map: bindings_map};
1681 arm_datas.push(arm_data);
1682 for arm.pats.iter().advance |p| {
1683 matches.push(@Match {pats: ~[*p], data: arm_data});
1684 }
1685 }
1686
1687 let t = node_id_type(bcx, discr_expr.id);
1688 let chk = {
1689 if ty::type_is_empty(tcx, t) {
1690 // Special case for empty types
1691 let fail_cx = @mut None;
1692 let f: mk_fail = || mk_fail(scope_cx, discr_expr.span,
1693 @"scrutinizing value that can't exist", fail_cx);
1694 Some(f)
1695 } else {
1696 None
1697 }
1698 };
1699 let lldiscr = discr_datum.to_zeroable_ref_llval(bcx);
1700 compile_submatch(bcx, matches, [lldiscr], chk);
1701
1702 let mut arm_cxs = ~[];
1703 for arm_datas.iter().advance |arm_data| {
1704 let mut bcx = arm_data.bodycx;
1705
1706 // If this arm has a guard, then the various by-value bindings have
1707 // already been copied into their homes. If not, we do it here. This
1708 // is just to reduce code space. See extensive comment at the start
1709 // of the file for more details.
1710 if arm_data.arm.guard.is_none() {
1711 bcx = store_non_ref_bindings(bcx, &arm_data.bindings_map, None);
1712 }
1713
1714 // insert bindings into the lllocals map and add cleanups
1715 bcx = insert_lllocals(bcx, &arm_data.bindings_map, BindLocal, true);
1716
1717 bcx = controlflow::trans_block(bcx, &arm_data.arm.body, dest);
1718 bcx = trans_block_cleanups(bcx, block_cleanups(arm_data.bodycx));
1719 arm_cxs.push(bcx);
1720 }
1721
1722 bcx = controlflow::join_blocks(scope_cx, arm_cxs);
1723 return bcx;
1724
1725 fn mk_fail(bcx: block, sp: span, msg: @str,
1726 finished: @mut Option<BasicBlockRef>) -> BasicBlockRef {
1727 match *finished { Some(bb) => return bb, _ => () }
1728 let fail_cx = sub_block(bcx, "case_fallthrough");
1729 controlflow::trans_fail(fail_cx, Some(sp), msg);
1730 *finished = Some(fail_cx.llbb);
1731 return fail_cx.llbb;
1732 }
1733 }
1734
1735 pub enum IrrefutablePatternBindingMode {
1736 // Stores the association between node ID and LLVM value in `lllocals`.
1737 BindLocal,
1738 // Stores the association between node ID and LLVM value in `llargs`.
1739 BindArgument
1740 }
1741
1742 // Not match-related, but similar to the pattern-munging code above
1743 pub fn bind_irrefutable_pat(bcx: block,
1744 pat: @ast::pat,
1745 val: ValueRef,
1746 make_copy: bool,
1747 binding_mode: IrrefutablePatternBindingMode)
1748 -> block {
1749 let _icx = push_ctxt("match::bind_irrefutable_pat");
1750 let ccx = bcx.fcx.ccx;
1751 let mut bcx = bcx;
1752
1753 // Necessary since bind_irrefutable_pat is called outside trans_match
1754 match pat.node {
1755 ast::pat_ident(_, _, ref inner) => {
1756 if pat_is_variant_or_struct(bcx.tcx().def_map, pat) {
1757 return bcx;
1758 }
1759
1760 if make_copy {
1761 let binding_ty = node_id_type(bcx, pat.id);
1762 let datum = Datum {val: val, ty: binding_ty,
1763 mode: ByRef(RevokeClean)};
1764 let scratch = scratch_datum(bcx, binding_ty, false);
1765 datum.copy_to_datum(bcx, INIT, scratch);
1766 match binding_mode {
1767 BindLocal => {
1768 bcx.fcx.lllocals.insert(pat.id, scratch.val);
1769 }
1770 BindArgument => {
1771 bcx.fcx.llargs.insert(pat.id, scratch.val);
1772 }
1773 }
1774 add_clean(bcx, scratch.val, binding_ty);
1775 } else {
1776 match binding_mode {
1777 BindLocal => {
1778 bcx.fcx.lllocals.insert(pat.id, val);
1779 }
1780 BindArgument => {
1781 bcx.fcx.llargs.insert(pat.id, val);
1782 }
1783 }
1784 }
1785
1786 for inner.iter().advance |inner_pat| {
1787 bcx = bind_irrefutable_pat(
1788 bcx, *inner_pat, val, true, binding_mode);
1789 }
1790 }
1791 ast::pat_enum(_, ref sub_pats) => {
1792 match bcx.tcx().def_map.find(&pat.id) {
1793 Some(&ast::def_variant(enum_id, var_id)) => {
1794 let repr = adt::represent_node(bcx, pat.id);
1795 let vinfo = ty::enum_variant_with_id(ccx.tcx,
1796 enum_id,
1797 var_id);
1798 let args = extract_variant_args(bcx,
1799 repr,
1800 vinfo.disr_val,
1801 val);
1802 for sub_pats.iter().advance |sub_pat| {
1803 for args.vals.iter().enumerate().advance |(i, argval)| {
1804 bcx = bind_irrefutable_pat(bcx,
1805 sub_pat[i],
1806 *argval,
1807 make_copy,
1808 binding_mode);
1809 }
1810 }
1811 }
1812 Some(&ast::def_fn(*)) |
1813 Some(&ast::def_struct(*)) => {
1814 match *sub_pats {
1815 None => {
1816 // This is a unit-like struct. Nothing to do here.
1817 }
1818 Some(ref elems) => {
1819 // This is the tuple struct case.
1820 let repr = adt::represent_node(bcx, pat.id);
1821 for elems.iter().enumerate().advance |(i, elem)| {
1822 let fldptr = adt::trans_field_ptr(bcx, repr,
1823 val, 0, i);
1824 bcx = bind_irrefutable_pat(bcx,
1825 *elem,
1826 fldptr,
1827 make_copy,
1828 binding_mode);
1829 }
1830 }
1831 }
1832 }
1833 Some(&ast::def_static(_, false)) => {
1834 bcx = bind_irrefutable_pat(bcx, pat, val, make_copy,
1835 binding_mode);
1836 }
1837 _ => {
1838 // Nothing to do here.
1839 }
1840 }
1841 }
1842 ast::pat_struct(_, ref fields, _) => {
1843 let tcx = bcx.tcx();
1844 let pat_ty = node_id_type(bcx, pat.id);
1845 let pat_repr = adt::represent_type(bcx.ccx(), pat_ty);
1846 do expr::with_field_tys(tcx, pat_ty, None) |discr, field_tys| {
1847 for fields.iter().advance |f| {
1848 let ix = ty::field_idx_strict(tcx, f.ident, field_tys);
1849 let fldptr = adt::trans_field_ptr(bcx, pat_repr, val,
1850 discr, ix);
1851 bcx = bind_irrefutable_pat(bcx,
1852 f.pat,
1853 fldptr,
1854 make_copy,
1855 binding_mode);
1856 }
1857 }
1858 }
1859 ast::pat_tup(ref elems) => {
1860 let repr = adt::represent_node(bcx, pat.id);
1861 for elems.iter().enumerate().advance |(i, elem)| {
1862 let fldptr = adt::trans_field_ptr(bcx, repr, val, 0, i);
1863 bcx = bind_irrefutable_pat(bcx,
1864 *elem,
1865 fldptr,
1866 make_copy,
1867 binding_mode);
1868 }
1869 }
1870 ast::pat_box(inner) | ast::pat_uniq(inner) => {
1871 let llbox = Load(bcx, val);
1872 let unboxed = GEPi(bcx, llbox, [0u, abi::box_field_body]);
1873 bcx = bind_irrefutable_pat(bcx,
1874 inner,
1875 unboxed,
1876 true,
1877 binding_mode);
1878 }
1879 ast::pat_region(inner) => {
1880 let loaded_val = Load(bcx, val);
1881 bcx = bind_irrefutable_pat(bcx,
1882 inner,
1883 loaded_val,
1884 true,
1885 binding_mode);
1886 }
1887 ast::pat_wild | ast::pat_lit(_) | ast::pat_range(_, _) |
1888 ast::pat_vec(*) => ()
1889 }
1890 return bcx;
1891 }