[/
    Copyright 2010 Neil Groves
    Distributed under the Boost Software License, Version 1.0.
    (See accompanying file LICENSE_1_0.txt or copy at http://www.boost.org/LICENSE_1_0.txt)
/]
[section:tokenized tokenized]

[table
    [[Syntax] [Code]]
    [
        [Pipe]
        [
        ``
        rng | boost::adaptors::tokenized(regex)
        rng | boost::adaptors::tokenized(regex, i)
        rng | boost::adaptors::tokenized(regex, rndRng)
        rng | boost::adaptors::tokenized(regex, i, flags)
        rng | boost::adaptors::tokenized(regex, rndRng, flags)
        ``
        ]
    ]
    [
        [Function]
        [
        ``
        boost::adaptors::tokenize(rng, regex)
        boost::adaptors::tokenize(rng, regex, i)
        boost::adaptors::tokenize(rng, regex, rndRng)
        boost::adaptors::tokenize(rng, regex, i, flags)
        boost::adaptors::tokenize(rng, regex, rndRng, flags)
        ``
        ]
    ]
]

* [*Precondition:]
    * Let `T` denote `typename range_value<decltype(rng)>::type`; then `regex` has the type `basic_regex<T>` or is implicitly convertible to it.
    * `i` has the type `int`.
    * The `value_type` of `rndRng` is `int`.
    * `flags` has the type `regex_constants::syntax_option_type`.
* [*Returns:] A range whose iterators behave as if they were the original iterators wrapped in `regex_token_iterator`. The first iterator in the range is constructed by forwarding all the arguments of `tokenized()` to the `regex_token_iterator` constructor.
* [*Throws:] Whatever constructing and copying equivalent `regex_token_iterator`s might throw.
* [*Range Category:] __random_access_range__
* [*Range Return Type:] `boost::tokenized_range<decltype(rng)>`
* [*Returned Range Category:] __random_access_range__

[section:tokenized_example tokenized_example]
[import ../../../test/adaptor_test/tokenized_example.cpp]
[tokenized_example]
[endsect]

This would produce the output:
``
a
b
c
d
e
f
g
hijklmnopqrstuvwxyz
``

[endsect]