private pure subroutine tokenize_into_first_last(string, set, first, last)
Parse a string into tokens. Each character in set is a token delimiter.
first is allocated with the lower bound equal to one and the upper bound equal to the number of tokens in string.
Each element is assigned, in array element order, the starting position of each token in string, in the order found.
last is allocated with the lower bound equal to one and the upper bound equal to the number of tokens in string.
Each element is assigned, in array element order, the ending position of each token in string, in the order found.
This subroutine implements the tokenize intrinsic procedure as defined in the Fortran 2023 language standard
(Section 16.9.210). We implement it ourselves because compiler support may take years to become widespread.
(KCW, 2025-10-29)
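
As a quick orientation, the sketch below shows the behaviour through the Fortran 2023 intrinsic form that this subroutine mirrors (call tokenize(string, set, first, last)). The program name and sample input are illustrative only and are not taken from this project, and the call compiles only once a compiler actually provides the intrinsic.

    program demo_tokenize
        implicit none
        character(*), parameter :: string = 'first,second,third'
        integer, allocatable :: first(:), last(:)
        integer :: i

        ! With set = ',' the expected result is first = [1, 7, 14] and
        ! last = [5, 12, 18], so string(first(i):last(i)) recovers each token.
        call tokenize(string, ',', first, last)

        do i = 1, size(first)
            print '(a)', string(first(i):last(i))
        end do
    end program demo_tokenize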
Variables
Type        Visibility    Attributes    Name                        Initial
integer     private                     l
integer     private                     n
integer     private                     pos
integer     private                     pos_end(len(string)+1)
integer     private                     pos_start(len(string)+1)
Source Code
pure subroutine tokenize_into_first_last(string, set, first, last)
    character(*), intent(in) :: string, set
    integer, allocatable, intent(out) :: first(:), last(:)

    integer :: pos_start(len(string)+1), pos_end(len(string)+1)
    integer :: l, n, pos

    l = len(string)
    n = 0
    pos = 0
    ! Each pass records one token: split advances pos to the position of the
    ! next delimiter from set (or to len(string)+1 when none remains), so the
    ! current token occupies positions pos_start(n) .. pos_end(n).
    do while (pos < l + 1)
        n = n + 1
        pos_start(n) = pos + 1
        call split(string, set, pos)
        pos_end(n) = pos - 1
    end do

    allocate(first(n), last(n))
    first(:) = pos_start(1:n)
    last(:) = pos_end(1:n)
end subroutine tokenize_into_first_last
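
As a worked illustration (not taken from the project's tests), consider string = 'a,,bc' with set = ','. The delimiters sit at positions 2 and 3, so the loop runs three times: split moves pos from 0 to 2, then to 3, then to 6 (len(string)+1), recording pos_start = [1, 3, 4] and pos_end = [1, 2, 5]. The allocated results are therefore first = [1, 3, 4] and last = [1, 2, 5]; the second token is zero-length because first(2) > last(2), which is how the standard represents adjacent delimiters.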