09 January 2025 | Challenge 299 |
Sparse Progress
Task 2: Word Search
Submitted by: Mohammad Sajid Anwar
You are given a grid of characters and a string.
Write a script to determine whether the given string can be found in the given grid of characters. You may start anywhere and take any orthogonal path, but may not reuse a grid cell.
Example 1
Input: @chars = (['A', 'B', 'D', 'E'],
                 ['C', 'B', 'C', 'A'],
                 ['B', 'A', 'A', 'D'],
                 ['D', 'B', 'B', 'C'])
       $str = 'BDCA'
Output: true
Example 2
Input: @chars = (['A', 'A', 'B', 'B'],
                 ['C', 'C', 'B', 'A'],
                 ['C', 'A', 'A', 'A'],
                 ['B', 'B', 'B', 'B'])
       $str = 'ABAC'
Output: false
Example 3
Input: @chars = (['B', 'A', 'B', 'A'],
                 ['C', 'C', 'C', 'C'],
                 ['A', 'B', 'A', 'B'],
                 ['B', 'B', 'A', 'A'])
       $str = 'CCCAA'
Output: true
Solution
This is a follow-up to my post for challenge 299.
Recap: We take the elements of the letter square as vertices of an undirected graph and create an adjacency matrix representing the neighborhood relation between the elements of the square.
What does this mean? Consider a letter grid of size \(100 \times 100\). Then we have \(10{,}000\) vertices and a \(10{,}000 \times 10{,}000\) adjacency matrix containing \(100{,}000{,}000\) elements, i.e. several hundred megabytes for a single dense matrix. Then we multiply such a monster with a vector of length \(10{,}000\). This certainly doesn’t look sane. Though I’m sure it produces correct results, it is highly questionable whether it is practically usable. We’ll see.
I had some improvements in mind, especially the usage of sparse matrices.
Luckily PDL comes with PDL::CCS as an implementation.
In the case of our \(100 \times 100\) grid there are only \(19{,}800\) grid edges, i.e. \(39{,}600\) non-zero elements in the symmetric \(10{,}000 \times 10{,}000\) adjacency matrix.
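The count follows directly from the grid structure:
\[\underbrace{100 \cdot 99}_{\text{horizontal edges}} + \underbrace{99 \cdot 100}_{\text{vertical edges}} = 19{,}800 \qquad\Rightarrow\qquad 2 \cdot 19{,}800 = 39{,}600 \text{ non-zero entries in the symmetric matrix.}\]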
Can we handle it that way?
Refined Procedure
As in the prequel we have a graph \((V, E)\) represented by an adjacency matrix \(A = (a_{ij})\) where
\[a_{ij} = \begin{cases} 1 & \text{if } (i, j) \in E \\ 0 & \text{otherwise} \end{cases}\]
Each vertex \(v \in V\) has an attached label \(l(v)\), and we have a set of characteristic vectors \(\chi_L = (\chi_i^L)\), one for each label \(L\), defined as
\[\chi_i^L = \begin{cases} 1 & \text{if } l(i) = L \\ 0 & \text{otherwise} \end{cases}\]
Furthermore we have a sequence of target labels \(l = (l_1,l_2,\ldots,l_k)\).
We may ignore all vertices that have a label distinct from all labels in \(l\).
In general, for a vector \(b\) representing a set of vertices, the product with the adjacency matrix \(Ab\) is non-zero for all vertices that are neighbors to the vertices in \(b\). Repeating this procedure leads to all vertices that are reachable with a walk of length \(n\) starting from \(b\) as \(A^nb\).
If - as in this task - the neighbor vertices are restricted to a given label as defined by a vertex set \(c\), the valid neighbors of \(b\) are \(c\cdot(Ab)\). Again, \(x\cdot y\) denotes the element-wise vector product.
Thus we define walk targets \(p_i\) at positions \(i\) with \(1 \le i \le k\) as
\[\begin{align*} p_1 &= \chi_{l_1} \\ p_i &= \chi_{l_i} \cdot (A p_{i - 1}) \text{ for } i > 1 \end{align*}\]
Each \(p_i\) marks the set of vertices that are reachable along a walk \((v_1,\ldots,v_i)\) of length \(i - 1\) having labels \(l(v_j) = l_j\).
Should any of \(p_i\) be null/empty, then there is no walk matching the given label sequence in the graph and thus there is no such path.
If each of the vertices in \(p\) appears only once, then each walk along \(p\) is a path and we succeed.
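To make the walk propagation concrete, here is a minimal dense-PDL sketch for the \(4 \times 4\) grid of Example 1 and the word BDCA. It is a toy illustration only: it merely decides whether a matching walk exists, using a dense adjacency matrix instead of the sparse machinery shown below.
use strict;
use warnings;
use PDL;
# the 4x4 grid of Example 1, flattened row by row into 16 vertices
my @grid = qw(A B D E  C B C A  B A A D  D B B C);
# dense 16x16 adjacency matrix of the grid graph (orthogonal neighbors)
my $adj = zeroes(16, 16);
for my $i (0 .. 15) {
    my ($r, $c) = (int($i / 4), $i % 4);
    if ($c > 0) { $adj->set($i, $i - 1, 1); $adj->set($i - 1, $i, 1) }   # left neighbor
    if ($r > 0) { $adj->set($i, $i - 4, 1); $adj->set($i - 4, $i, 1) }   # upper neighbor
}
# characteristic vector chi_L: 1 where the grid holds label L
sub chi { my $L = shift; pdl(map { $_ eq $L ? 1 : 0 } @grid) }
# walk targets p_i = chi_{l_i} * (A p_{i-1}) for the word BDCA
my @labels = split //, 'BDCA';
my $p = chi(shift @labels);
for my $l (@labels) {
    $p = chi($l) * ($adj x $p->dummy(0))->flat->hclip(1);
}
print $p->any ? "a walk with labels BDCA exists\n" : "no such walk\n";
The hclip(1) step reduces the matrix-vector product to a 0/1 indicator before the element-wise multiplication with \(\chi_{l_i}\).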
At this point we may have solved the problem or may have proven that the word is missing in the grid, though the “hard” cases, where a walk exists but may reuse vertices, remain unsolved. But the first part of this procedure has revealed some important information: with \(p_k\) we have all possible end vertices of such a walk, and we may use these as start vertices for a backward path search along the labels \((l_k,\ldots,l_1)\).
Searching for a path is much more expensive than finding a walk. By applying the first part of the process, the search space for paths is significantly reduced in most cases.
To simplify the description of the following process, we now reverse the labels as \(l_i \rightarrow l_{k - i + 1}\) and \(p_i \rightarrow p_{k - i + 1}\).
So we are given a problem \(P = (A, (p_1,\ldots,p_k))\) where \(A\) is an adjacency matrix and \((p_1,\ldots,p_k)\) is a sequence of indicator vectors specifying the permitted vertices at each step.
If \(k = 1\), i.e. there is no step to be performed, we succeed.
Otherwise we loop over the individual vertices \(\hat{v} \in p_1\):
- Find the allowed neighbor vertices of \(\hat{v}\) as \(p^* = p_2 \cdot (A \hat{v})\).
- If \(p^*\) is null/empty, continue with the next \(\hat{v}\).
- Create a new sequence of indicator vectors \((q_1,\ldots,q_{k-1}) = (1 - \hat{v})\cdot(p^*, p_3,\ldots,p_k)\) that has any further appearances of \(\hat{v}\) disabled in the path.
- Recursively solve the problem \(P(A, (q_1,\ldots,q_{k-1}))\).
- When the sub-problem succeeds, the current problem succeeds, too.
- Otherwise we continue looping with the next \(\hat{v} \in p_1\).
- If we fall through to the end of the loop, the problem fails.
This procedure may be optimized by processing all unique vertices in \(p_1\) at once, as there is actually no need to delete them from the remaining steps; a plain-Perl sketch of the recursion follows below.
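Expressed with hash sets instead of PDL indicator vectors (the name find_path_sketch and the data layout are my own and purely illustrative, and the batching of unique vertices is omitted), the recursion could look like this:
use strict;
use warnings;
# %$adj maps each vertex to a hash set of its neighbors,
# @$p is the list of permitted vertex sets, one per step.
sub find_path_sketch {
    my ($adj, $p) = @_;
    return 1 if @$p == 1;                        # no step left: success
    for my $v (keys %{$p->[0]}) {                # loop over the start vertices
        # allowed neighbors of $v in the next step: p* = p_2 . (A v)
        my %next = map { $_ => 1 } grep { $p->[1]{$_} } keys %{$adj->{$v} || {}};
        next unless %next;                       # dead end, try the next start
        # copy the remaining steps and disable $v everywhere
        my @q = (\%next, map { +{ %$_ } } @{$p}[2 .. $#$p]);
        delete $_->{$v} for @q;
        return 1 if find_path_sketch($adj, \@q); # the sub-problem succeeded
    }
    return 0;                                    # all start vertices failed
}
The actual find_path below does the same with indicator vectors and additionally processes all unique start vertices in a single batch.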
Preparation
Again, we need to construct the initial adjacency matrix representing the allowed “orthogonal paths” in the \(M \times N\) grid. This time we do not build the full dense matrix but specify only its non-zero elements.
Here we may spend some more effort on optimization. We only need to consider vertices having one of the target labels, and we may reduce the adjacency matrix to those pairs of vertices where both labels are contained in the set of target labels.
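As a minimal sketch of what specifying only the non-zero elements means, the sparse adjacency matrix of a tiny \(2 \times 2\) grid could be built from its coordinate pairs with PDL::CCS::Nd->newFromWhich like this (toy dimensions and no label filtering; the real code below derives the pairs from the grid and the search word):
use strict;
use warnings;
use PDL;
use PDL::CCS;
# 2x2 grid, vertices numbered row by row:  0 1
#                                          2 3
# orthogonal edges, stored in both directions for a symmetric matrix
my $which = pdl(long, [[0,1],[1,0],[0,2],[2,0],[1,3],[3,1],[2,3],[3,2]]);
my $adj = PDL::CCS::Nd->newFromWhich(
    $which,                          # (2, Nnz) pdl of non-zero coordinates
    ones(long, $which->dim(1)),      # the non-zero values, all 1
    pdims   => [4, 4],               # logical size of the matrix
    missing => 0,                    # value of all unspecified elements
);
print $adj->decode, "\n";            # expand to a dense 4x4 pdl for inspection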
Then we construct all characteristic vectors \(\chi_l\) and the resulting walk targets and proceed as described above.
Benchmarks
We build a \(100 \times 100\) grid containing random letters drawn from character sets of different sizes.
For benchmarking I chose choroba’s solution as the competing scan approach.
The benchmark is performed on a series of different random arrangements.
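The harness itself is not shown in this post, but with the core Benchmark module a single comparison could look roughly like this (random_grid and scan_word are assumed names; scan_word stands in for choroba’s scan solution and find_word is the graph solution given below):
use strict;
use warnings;
use Benchmark 'cmpthese';
# build a random $size x $size grid from the first $set letters of the alphabet
sub random_grid {
    my ($size, $set) = @_;
    my @letters = ('A' .. 'Z')[0 .. $set - 1];
    return [map { [map { $letters[int rand @letters] } 1 .. $size] } 1 .. $size];
}
my $grid = random_grid(100, 10);     # character set of size 10: A .. J
my $word = 'ABCBD';
cmpthese(-3, {
    graph => sub { find_word($grid, $word) },    # the PDL::CCS solution below
    scan  => sub { scan_word($grid, $word) },    # placeholder for the scan solution
});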
Some insights are:
- The speed of the scan depends on the sample. An early hit makes it faster, a failing search slows it down. The graph approach does not vary that much. It depends more on the search word, where unique letters are handled consistently faster than non-unique letters.
- The worst case for the graph approach is a search for non-unique letters where a walk was found; in that case the scan approach usually runs faster than the graph.
- The average over multiple grids depends mainly on the degree of difficulty: a smaller set of letters favors the scan, larger sets with fewer found words favor the graph approach.
Benchmark for a \(100 \times 100\) grid, search word ABCBD, character sets 10 (A .. J) and 20 (A .. T):
10:
        Rate graph  scan
graph  260/s    --  -57%
scan   600/s  131%    --

20:
        Rate  scan graph
scan   163/s    --  -59%
graph  395/s  142%    --
The Code
Besides the usage of PDL::CCS sparse matrices, there are two key measures leading to the observed speed for the graph approach:
- The omission of all grid cells holding characters that do not appear in the search word.
- The usage of the low-level matrix multiplication provided by PDL::CCS::MatrixOps::ccs_matmult2d_zdd. Thanks to moocow-the-bovine for quickly fixing an issue in PDL::CCS related to this project and pointing me to the low-level functions.
Each measure resulted in a speed-up by a factor of 2. Without either of them, the scan approach would have been faster in almost all circumstances.
use strict;
use warnings;
use PDL;
use PDL::NiceSlice;
use PDL::Char;
use PDL::CCS;
use experimental 'signatures';
use feature 'say';
our $verbose;
# Moving forward along a walk by calculating c *= A x b. "A", a
# PDL::CCS::Nd object, must be passed as the extra (NOtherPars)
# argument. Using the fast low-level matrix product from
# PDL::CCS::MatrixOps. The result is a binary vector.
broadcast_define 'x_forward(b(n);c(n)), NOtherPars => 1', over {
    my ($b, $c, $a) = @_;
    my $p = zeroes long, 1, $c->dim(0);
    ccs_matmult2d_zdd($a->_whichND, $a->_nzvals,
        $b->dummy(0), $p);
    $p->inplace->hclip(1);
    $c *= $p((0));
    die "result is null" if !$c->any;
};
sub find_word ($matrix, $word) {
    my $m = PDL::Char->new($matrix);
    my $w = PDL::Char->new([$word]);
    my $chars = $w->long->clump(2);
    # create a layered mask for word's characters in the grid
    my $mw = $m((0))->long->dummy(2, $chars->dim(0))
        ->reorder(2, 0, 1) == $chars;
    # combine layers
    my $mask = $mw->orover;
    # side action from this state:
    # flatten matrix structure in the layers and drop all empty slots
    my $p = $mw->clump(1, 2)->reorder(1, 0)->(which $mask)->sever;
    # continue with mask:
    # set unmasked elements to BAD
    $mask->inplace->setvaltobad(0);
    # enumerate vertices
    my $asize = $mask->ngood->sclr;
    $mask->where($mask->isgood) .= sequence long, $asize;
    # create row and column pairs
    my $r = $mask(0:-2)->clump(2)->glue(1, $mask(1:-1)->clump(2));
    my $c = $mask(,0:-2)->clump(2)->glue(1, $mask(,1:-1)->clump(2));
    # concat pairs
    my $rc = $r->glue(0, $c);
    # find indices of good pairs
    my $igood = which !$rc->reorder(1, 0)->nbadover;
    ($verbose && say("no good pairs") && 0) ||
        return 0 if $igood->isempty;
    # pick good pairs
    my $good = $rc($igood)->reorder(1, 0);
    # symmetrize
    my $which = $good->glue(1, $good(-1:0));
    # create sparse adjacency matrix from pairs
    my $adj = PDL::CCS::Nd->newFromWhich($which, ones(long, $which->dim(1)),
        pdims => [$asize, $asize], missing => 0);
    # try to find a walk along the word's characters, dropping all
    # vertices that are not on a forward walk. Broadcast over layers.
    eval {
        x_forward($p(,0:-2), $p(,1:-1), $adj);
        1;
    } || return 0;
    say "found walk" if $verbose;
    ($verbose && say("all walks are paths") && 0) ||
        return 1 if $p->xchg(0, 1)->sumover->max == 1;
    # search backward for a path
    find_path($adj, $p(,-1:0));
}
# search for a path: start paths from unique vertices or each
# individual non-unique vertex, remove the starting vertex and
# recursively try to reach the target
sub find_path($adj, $p) {
    # count start vertex appearances
    my $vc = $p->xchg(0, 1)->sumover * $p(,(0));
    # create matrix with one row for all unique vertices at once and
    # one row for each non-unique vertex
    my $uniq = which $vc == 1;
    my $nu = which $vc > 1;
    my $q = zeroes long, !$uniq->isempty + $nu->nelem, $vc->dim(0);
    $q(0,$uniq) .= 1 unless $uniq->isempty;
    $q(-$nu->nelem:-1,$nu)->diagonal(0, 1) .= 1 unless $nu->isempty;
    for my $p0 ($q->xchg(0, 1)->dog) {
        # pick the second through the last column of p
        my $p1 = $p(,1:-1)->sever;
        # step forward
        eval {
            x_forward($p0, $p1(,(0)), $adj);
            1;
        } || next;
        return 1 if $p->dim(1) == 2;
        # delete selected vertices
        $p1(which $p0) .= 0;
        return 1 if find_path($adj, $p1);
    }
    0;
}
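For completeness, a minimal driver for the code above, using the grid and word of Example 1 (this driver is an addition for illustration, not part of the original script):
my @chars = (['A', 'B', 'D', 'E'],
             ['C', 'B', 'C', 'A'],
             ['B', 'A', 'A', 'D'],
             ['D', 'B', 'B', 'C']);
print find_word(\@chars, 'BDCA') ? "true\n" : "false\n";    # prints "true"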
Conclusion
This work has no practical purpose.
I just wanted to find out if the idea presented in my solution to task 2 from challenge 299 could be extended to larger grids.
It was just another exercise in PDL and especially in PDL::CCS.
See the full solution to task 2.
If you have a question about this post or if you would like to comment on it, feel free to open an issue in my GitHub repository.