
Hi Alper,
On Thu, 3 Mar 2022 at 14:16, Alper Nebi Yasak <alpernebiyasak@gmail.com> wrote:
On 24/02/2022 02:00, Simon Glass wrote:
At present the fit implementation creates the output tree while scanning the FIT description. Then it updates the tree later when the data is known.
This works, but is a bit confusing, since it requires mixing the scanning code with the generation code, with a fix-up step at the end.
It is actually possible to do this in two phases, one to scan everything and the other to generate the FIT. Thus the FIT is generated in one pass, when everything is known.
Doing it in one go makes sense to me as well. In general I like the way distinct processing actions/steps are being split into their own blocks or so, and I think this helps move things toward that.
OK good.
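To restate the split as a toy example (just the scan-then-generate pattern, not the binman code itself): phase one collects what is needed without producing anything, phase two emits the output in a single pass once all the data is known.

def scan(description):
    """Phase 1: collect the image nodes without generating anything"""
    return [node for node in description if node['type'] == 'image']

def generate(images):
    """Phase 2: emit the output in one pass, 'data' included"""
    return '\n'.join('%s: %d bytes' % (img['name'], len(img['data']))
                     for img in images)

description = [
    {'type': 'image', 'name': 'kernel', 'data': b'\0' * 16},
    {'type': 'config', 'name': 'conf-1'},
]
print(generate(scan(description)))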
Update the code accordingly. The only functional change is that the 'data' property for each node is now last instead of first, which is really a more natural position. Update the affected test to deal with this.
One wrinkle is that the calculated properties (image-pos, size and offset) are now added before the FIT is generated, so we must filter these out when copying properties from the binman description to the FIT.
Most of the change here is splitting out some of the code from the ReadEntries() implementation into _BuildInput(). So despite the large diff, most of the code is the same. It is not feasible to split this patch up, so far as I can tell.
Signed-off-by: Simon Glass <sjg@chromium.org>
Changes in v2:
- Add new patch to refactor fit to generate output at the end
tools/binman/etype/fit.py              | 178 ++++++++++++++-----------
tools/binman/ftest.py                  |  13 +-
tools/binman/test/224_fit_bad_oper.dts |   2 -
3 files changed, 109 insertions(+), 84 deletions(-)
Reviewed-by: Alper Nebi Yasak <alpernebiyasak@gmail.com>
I still wrote some weird ideas below, mostly for the future, since this patch is mostly moving code around, which is fine as is.
diff --git a/tools/binman/etype/fit.py b/tools/binman/etype/fit.py
index 2d4c5f6545..61c72780e9 100644
--- a/tools/binman/etype/fit.py
+++ b/tools/binman/etype/fit.py
@@ -209,6 +209,81 @@ class Entry_fit(Entry_section):
        return oper
def ReadEntries(self):
def _add_entries(base_node, depth, node):
"""Add entries for any nodes that need them
Args:
base_node: Base Node of the FIT (with 'description' property)
depth: Current node depth (0 is the base 'fit' node)
node: Current node to process
He we only need to provide binman entries which are used to define
He -> Here ?
the 'data' for each image. We create an entry_Section for each.
"""
rel_path = node.path[len(base_node.path):]
in_images = rel_path.startswith('/images')
has_images = depth == 2 and in_images
if has_images:
# This node is a FIT subimage node (e.g. "/images/kernel")
# containing content nodes. We collect the subimage nodes and
# section entries for them here to merge the content subnodes
# together and put the merged contents in the subimage node's
# 'data' property later.
entry = Entry.Create(self.section, node, etype='section')
entry.ReadNode()
I plan to change 'self.section' to 'self' here later, which fixes extracting the wrong contents for FIT subentries.
OK
# The hash subnodes here are for mkimage, not binman.
entry.SetUpdateHash(False)
self._entries[rel_path] = entry
I also plan to change this to a single-level node name instead of the relative path, which lets 'binman extract fit/u-boot' etc. run at all.
That would be good.
for subnode in node.subnodes:
_add_entries(base_node, depth + 1, subnode)
_add_entries(self._node, 0, self._node)
I think what I meant by switching away from recursion is especially visible here: this recurses through every node but only does anything on the immediate subnodes of "/images" (for now?).
Fair enough.
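Untested sketch of what a non-recursive version might look like, assuming the dtoc Node API (FindNode(), .subnodes, .name) that this file already relies on, and keying by node name as you suggest above:

def ReadEntries(self):
    """Create entries only for the immediate subnodes of /images (sketch)"""
    images = self._node.FindNode('images')
    if not images:
        return
    for node in images.subnodes:
        # Each /images/<name> node becomes a section entry whose data
        # ends up in that node's 'data' property
        entry = Entry.Create(self.section, node, etype='section')
        entry.ReadNode()
        # The hash subnodes here are for mkimage, not binman
        entry.SetUpdateHash(False)
        self._entries[node.name] = entry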
- def BuildSectionData(self, required):
"""Build FIT entry contents
This adds the 'data' properties to the input ITB (Image-tree Binary)
then runs mkimage to process it.
Args:
required: True if the data must be present, False if it is OK to
return None
I forgot to handle 'required' while converting FIT to section...
Returns:
Contents of the section (bytes)
"""
data = self._BuildInput()
uniq = self.GetUniqueName()
input_fname = tools.get_output_filename('%s.itb' % uniq)
output_fname = tools.get_output_filename('%s.fit' % uniq)
tools.write_file(input_fname, data)
tools.write_file(output_fname, data)
args = {}
ext_offset = self._fit_props.get('fit,external-offset')
if ext_offset is not None:
args = {
'external': True,
'pad': fdt_util.fdt32_to_cpu(ext_offset.value)
}
if self.mkimage.run(reset_timestamp=True, output_fname=output_fname,
**args) is None:
I have an idea for the far future, to let /image/* nodes sometimes be Entry_collection to handle external offsets in binman so we can take mkimage completely out of this, but no clue how feasible/desirable that end goal is.
Well, I would really prefer to avoid duplicating mkimage functionality. I think it is something to look at when binman is more widespread, but as you can see we are still sorting out the FIT issues.
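For reference, my rough mental model of external data (the 'external'/'pad' arguments above) is sketched below, purely as an illustration of what binman would have to replicate rather than a proposal; the details (alignment, layout) are my assumptions, not taken from mkimage:

# Hedged sketch: with external data the image contents are moved out of
# the FDT structure and placed at fixed positions from the external
# offset, with each image node keeping only position/size information.
def layout_external(images, ext_offset, align=4):
    """Return {name: (position, size)} with images packed from ext_offset"""
    pos = ext_offset
    out = {}
    for name, data in images.items():
        out[name] = (pos, len(data))
        pos = (pos + len(data) + align - 1) & ~(align - 1)
    return out

print(layout_external({'kernel': b'k' * 1000, 'fdt-1': b'f' * 300}, 0x1000))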
# Bintool is missing; just use empty data as the output
self.record_missing_bintool(self.mkimage)
return tools.get_bytes(0, 1024)
return tools.read_file(output_fname)
- def _BuildInput(self):
"""Finish the FIT by adding the 'data' properties to it
Arguments:
fdt: FIT to update
Returns:
New fdt contents (bytes)
""" def _process_prop(pname, prop): """Process special properties
@@ -236,9 +311,15 @@ class Entry_fit(Entry_section):
    val = val[1:].replace('DEFAULT-SEQ', str(seq + 1))
    fsw.property_string(pname, val)
    return
elif pname.startswith('fit,'):
# Ignore these, which are commands for binman to process
return
elif pname in ['offset', 'size', 'image-pos']:
# Don't add binman's calculated properties
return
This is one of the things I was thinking of doing, thanks. I encountered the same issue when replacing a FIT entry with the same f"{uniq}.fit" that was used to build it; I'll try adding a test for that later.
Yes good.
fsw.property(pname, prop.bytes)
def _scan_gen_fdt_nodes(subnode, depth, in_images):
def _gen_fdt_nodes(subnode, depth, in_images):
    """Generate FDT nodes

    This creates one node for each member of self._fdts using the
[...]
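As an aside on the offset/size/image-pos filtering above, the rule in isolation amounts to this (a small self-contained sketch, not the patch code):

# Properties that binman itself directs or calculates must not be
# copied from the binman description into the generated FIT.
CALCULATED_PROPS = {'offset', 'size', 'image-pos'}

def should_copy(pname):
    """Return True if a property belongs in the output FIT"""
    if pname.startswith('fit,'):
        return False  # commands for binman to process
    if pname in CALCULATED_PROPS:
        return False  # values binman calculates when packing
    return True

assert should_copy('compression')
assert not should_copy('image-pos')
assert not should_copy('fit,external-offset')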
diff --git a/tools/binman/test/224_fit_bad_oper.dts b/tools/binman/test/224_fit_bad_oper.dts
index cee801e2ea..8a8014ea33 100644
--- a/tools/binman/test/224_fit_bad_oper.dts
+++ b/tools/binman/test/224_fit_bad_oper.dts
@@ -21,7 +21,5 @@
		};
	};
};
fdtmap {
};
This looked unrelated at first, but as I understand it, changing the order of things made the bad-operation error happen later and exposed a breakage from the fdtmap entry trying to parse the mock dtb data.
Yes, that's it.
Regards, Simon