This was just a quick example that I threw together. I don't think the performance issue is related to the ReDim function, however.

I replaced the 'KIX' test string with a file of about 25 KB, loaded via the FileIO function. The file load consistently takes 15-16 ms to read 282 lines / 25,328 characters into an array of lines.

Running this with the original logic, which performs a ReDim for each character, took about 3.7-3.9 seconds.
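
For reference, the per-character pattern would have looked something like this - my reconstruction from the description, not the original code:
 Code:
; Sketch of the original per-character logic: one ReDim Preserve per character
Function TxtToAryPerChar($_S)

  Dim $_P
  For $_P = 1 to Len($_S)
    ReDim Preserve $TxtToAryPerChar[$_P - 1]    ; grow the array by one element every pass
    $TxtToAryPerChar[$_P - 1] = SubStr($_S, $_P, 1)
  Next

  Exit 0

EndFunction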

I modified the function to ReDim only when the counter exceeds the array size. With a growth value of 15000, it resizes the array only 3 times (twice during processing and once at the end to trim to the actual size). This was a common method in earlier versions, where array management seemed to require a lot of overhead.
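
In function form, that approach looks something like this sketch (the posted test code below uses a growth value of 100 rather than 15000):
 Code:
; Sketch: grow in 15000-element chunks, trim once at the end
Function TxtToAry15K($_S)

  Dim $_P, $_C
  $_C = -1
  For $_P = 1 to Len($_S)
    $_C = $_C + 1
    If $_C >= UBound($TxtToAry15K)
      ReDim Preserve $TxtToAry15K[15000 + $_C]  ; resize only when the index passes the bound
    EndIf
    $TxtToAry15K[$_C] = SubStr($_S, $_P, 1)
  Next

  ReDim Preserve $TxtToAry15K[$_C]              ; final trim to actual size

  Exit 0

EndFunction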

So - I converted the array of source data to a single large string ($D = Join($Array, @CRLF)) and passed that to the function, timing with millisecond accuracy. The process took 3.766 seconds, consistent to within +/- 20 ms.

I then changed the logic to avoid calling SubStr() on such a large string, passing the 282 lines to the function one at a time. Even including the overhead of appending the small arrays to a single large array, as originally done, the process took just 235 ms, consistent to within +/- 10 ms.

Here's my test code - you'll need the FileIO and TimeDiff functions and either insert them or change the Call reference path.
 Code:
Break On

Call '%KIXLIBPATH%\FileIO.kxf'
Call '%KIXLIBPATH%\TimeDiff.kxf'

; Load the test file, capturing start/end timestamps with millisecond resolution
$S = @DATE + ' ' + @Time + '.' + Right('000' + @MSECS, 3)
$D = FileIO('C:\Temp\Test.txt', 'R')
$E = @DATE + ' ' + @Time + '.' + Right('000' + @MSECS, 3)

'Data Load -' ?
' Size: ' Len(Join($D, @CRLF)) ?
'Start: ' $S ?
'  End: ' $E ?
'Total: ' TimeDiff($S, $E, ,1) ? ?


; Method 1 - convert the entire file as one large text block
$S = @DATE + ' ' + @Time + '.' + Right('000' + @MSECS, 3)
$A = TxtToAry(Join($D, @CRLF))
$E = @DATE + ' ' + @Time + '.' + Right('000' + @MSECS, 3)
'Conversion by Single Text Block-' ?
'Start: ' $S ?
'  End: ' $E ?
'Total: ' TimeDiff($S, $E, ,1) ?
'Array: ' 1 + UBound($A) ? ?

$ = FileIO('C:\Temp\test1.txt', 'W', $A)

; Method 2 - convert the file one line at a time, merging results into a master array
Dim $aF
$S = @DATE + ' ' + @Time + '.' + Right('000' + @MSECS, 3)
$C = -1
For Each $Line in $D
  $A = TxtToAry($Line + @CRLF)     ; convert one line, including its line break
  $T = UBound($A)                  ; last index of the line's char array
  $F = UBound($aF)                 ; last index of the master array
  ReDim Preserve $aF[1 + $F + $T]  ; extend master array to hold the new chars
  For $I = 0 to $T
    $C = $C + 1
    $aF[$C] = $A[$I]
  Next
Next
$E = @DATE + ' ' + @Time + '.' + Right('000' + @MSECS, 3)

'Conversion by Line -' ?
'Lines: ' 1 + UBound($D) ?
'Start: ' $S ?
'  End: ' $E ?
'Total: ' TimeDiff($S, $E, ,1) ?
'Array: ' 1 + UBound($aF) ? ?

$ = FileIO('C:\Temp\test2.txt', 'W', $aF)


; Convert a text string to an array of characters
; Glenn Barnas
Function TxtToAry($_S)

  Dim $_P, $_C
  $_C = -1

  For $_P = 1 to Len($_S)
    $_C = $_C + 1
    If $_C >= UBound($TxtToAry)             ; grow only when the index passes the current bound
      ReDim Preserve $TxtToAry[100 + $_C]   ; growth chunk of 100 elements
    EndIf
    $TxtToAry[$_C] = SubStr($_S, $_P, 1)    ; $_C is always $_P - 1
  Next

  ReDim Preserve $TxtToAry[$_C]             ; trim to actual size

  Exit 0

EndFunction
Note that the Conversion by Line method returns an array that includes an extra CR and LF at the end. That's because the initial method uses Join($Array, @CRLF), which doesn't add a final CRLF to the text, while the per-line method adds a CRLF after every line, including the last.
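
If that extra CRLF matters, trimming the last two elements after the loop would make the outputs identical - a sketch, using $aF from the test code above:
 Code:
; Drop the trailing CR and LF characters appended after the last line
ReDim Preserve $aF[UBound($aF) - 2]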

So the performance difference clearly comes from the SubStr function handling very large strings, not from the ReDim.
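
You can verify this by isolating SubStr() - walking the large string character by character with no array or ReDim involved at all. A sketch using the same timing pattern (assumes $D is loaded as above):
 Code:
; Isolate SubStr(): scan the large string, discarding each character
$B = Join($D, @CRLF)
$S = @DATE + ' ' + @Time + '.' + Right('000' + @MSECS, 3)
For $I = 1 to Len($B)
  $ = SubStr($B, $I, 1)
Next
$E = @DATE + ' ' + @Time + '.' + Right('000' + @MSECS, 3)
'SubStr only: ' TimeDiff($S, $E, ,1) ?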

Glenn
_________________________
Actually I am a Rocket Scientist! :D