
On 06/10/2018 03:25 PM, Bin Meng wrote:
> Since commit bb0bb91cf0aa ("efi_stub: Use efi_uintn_t"), EFI x86 64-bit
> payload does not work anymore. The call to GetMemoryMap() in efi_stub.c
> fails with return code EFI_INVALID_PARAMETER. Since the payload itself
> is still 32-bit U-Boot
Above you say 64-bit payload and now you say 32-bit?
Why don't you compile U-Boot as 64-bit? How do you want to load a 64-bit Linux EFI stub from a 32-bit EFI implementation in U-Boot?
> , efi_uintn_t gets wrongly interpreted as int, but it should actually
> be long in a 64-bit EFI environment.
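To make the mismatch described above concrete, here is a minimal sketch (invented names, deliberately ignoring alignment and aliasing rules; not the actual U-Boot code). On x86_64 firmware, UINTN is 64 bits wide, so GetMemoryMap() reads and writes 8-byte values through its pointer arguments, while a caller whose efi_uintn_t is only 32 bits hands it a 4-byte slot:

    #include <stdint.h>
    #include <stdio.h>

    /* stand-in for what 64-bit firmware does with the first
     * GetMemoryMap() argument: it reads a full 64-bit UINTN
     */
    static void get_memory_map(uint64_t *memory_map_size)
    {
            printf("firmware sees size = 0x%llx\n",
                   (unsigned long long)*memory_map_size);
    }

    int main(void)
    {
            struct {
                    uint32_t map_size;   /* mis-sized efi_uintn_t */
                    uint32_t neighbour;  /* unrelated stack data  */
            } s = { 4096, 0xdeadbeef };

            /* the cast mirrors what an ABI mismatch does silently:
             * on a little-endian machine the firmware sees
             * 0xdeadbeef00001000, not 4096
             */
            get_memory_map((uint64_t *)&s);
            return 0;
    }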
> Fixes: bb0bb91cf0aa ("efi_stub: Use efi_uintn_t")
> Signed-off-by: Bin Meng <bmeng.cn@gmail.com>
>  include/efi_api.h | 4 ++++
>  1 file changed, 4 insertions(+)
> diff --git a/include/efi_api.h b/include/efi_api.h
> index 64c27e4..d1158de 100644
> --- a/include/efi_api.h
> +++ b/include/efi_api.h
> @@ -28,7 +28,11 @@ enum efi_timer_delay {
>  	EFI_TIMER_RELATIVE = 2
>  };
> 
> +#if defined(CONFIG_EFI_STUB_64BIT) && defined(EFI_STUB)
> +#define efi_uintn_t unsigned long
> +#else
>  #define efi_uintn_t size_t
NAK
This change will create a lot of build warnings if EFI_STUB and EFI_LOADER are both configured.
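A sketch of the kind of warning to expect (hypothetical file, assuming a 32-bit build where size_t is unsigned int): the prototype expands to unsigned long * in stub translation units while callers keep passing size_t *, and those are distinct C types even when they have the same width.

    /* illustrative only; compile with -m32 -DEFI_STUB */
    #include <stddef.h>

    #ifdef EFI_STUB
    #define efi_uintn_t unsigned long    /* proposed stub expansion */
    #else
    #define efi_uintn_t size_t
    #endif

    void get_map(efi_uintn_t *size);     /* prototype as in efi_api.h */

    void caller(void)
    {
            size_t size = 0;             /* unsigned int on i386 */
            get_map(&size);              /* -Wincompatible-pointer-types:
                                          * unsigned int * passed where
                                          * unsigned long * is expected */
    }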
Could you please explain under which compiler settings size_t and unsigned long have a different number of bits?
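(For what it is worth, that assumption can be pinned down at build time; a minimal sketch with C11's _Static_assert:)

    #include <stddef.h>

    /* trips only on LLP64-style targets, where unsigned long is
     * 32 bits but size_t is 64; it holds on every ILP32/LP64
     * target U-Boot builds for
     */
    _Static_assert(sizeof(size_t) == sizeof(unsigned long),
                   "size_t and unsigned long differ in width");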
Obviously the EFI API exposed by U-Boot has the bitness of U-Boot itself.
If you want to consume an EFI API of another bitness, I suggest that you create separate interface definitions.
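A rough sketch of what such separate definitions could look like (names invented here; fixed-width types keep them correct regardless of the compiler's size_t, and the EFIAPI calling-convention attributes are omitted for brevity):

    #include <stdint.h>

    /* hypothetical: UINTN as the 64-bit firmware defines it */
    typedef uint64_t efi64_uintn_t;

    /* hypothetical 64-bit-ABI prototype for GetMemoryMap() */
    typedef efi64_uintn_t (*efi64_get_memory_map_t)(
            efi64_uintn_t *memory_map_size,
            void *memory_map,
            efi64_uintn_t *map_key,
            efi64_uintn_t *descriptor_size,
            uint32_t *descriptor_version);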
Best regards
Heinrich Schuchardt
> +#endif
> 
>  typedef uint16_t *efi_string_t;
> 
>  #define EVT_TIMER			0x80000000