
On 01/16/2012 11:11 PM, Simon Glass wrote:
From: Rakesh Iyer <riyer@nvidia.com>
Add support for the internal matrix keyboard controller on Nvidia Tegra platforms. This driver uses the fdt decode function to obtain its key codes.
...
+static uchar *create_keymap(u32 *data, int len, int map_keycode, int *pos)
...
entry = row * KBC_MAX_COL + col;
map[entry] = key_code;
This needs to be range-checked.
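Something like this, perhaps (I'm assuming the map holds KBC_KEY_COUNT entries, which is what the input_add_table() call further down implies):

	entry = row * KBC_MAX_COL + col;
	/* skip entries that would index outside the map; alternatively,
	   free(map) and return NULL to reject the whole keymap */
	if (entry >= KBC_KEY_COUNT) {
		debug("%s: entry %d out of range\n", __func__, entry);
		continue;
	}
	map[entry] = key_code;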
if (pos && map_keycode == key_code)
*pos = entry;
}
return map;
+}
+static int fdt_decode_kbc(const void *blob, int node, struct keyb *config)
...
/* Name needs to match "1,<type>keymap" */
debug("%s: property '%s'\n", __func__, name);
if (strncmp(name, "1,", 2) || len < 8 ||
strcmp(name + len - 6, "keymap"))
continue;
"linux," not "l,". Why not just strcmp against the two values simply and directly; the above matches against totally bogus stuff like "l,bogus-cruft-keymap".
len -= 8;
No need for that then, right?
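Comparing the names directly would look something like this (I'm guessing the binding defines exactly two properties, "linux,keymap" and "linux,fn-keymap"; substitute whatever names the binding actually uses):

	/* sketch: match the two expected property names exactly */
	if (strcmp(name, "linux,keymap") &&
	    strcmp(name, "linux,fn-keymap"))
		continue;

which also gets rid of the len arithmetic entirely.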
+static int init_tegra_keyboard(void)
...
if (config.fn_keycode) {
if (input_add_table(&config.input, KEY_FN, -1,
config.fn_keycode, KBC_KEY_COUNT))
return -1;
}
I don't see anywhere that calls input_add_table() for config.plain_keycode.
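I'd expect something mirroring the KEY_FN table above, roughly like the following (the -1, -1 modifier arguments are my guess for a plain, unmodified map):

	if (config.plain_keycode) {
		if (input_add_table(&config.input, -1, -1,
				    config.plain_keycode, KBC_KEY_COUNT))
			return -1;
	}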
+int drv_keyboard_init(void)
+{
struct stdio_dev dev;
if (input_init(&config.input, 0)) {
debug("%s: Cannot set up input\n", __func__);
return -1;
}
config.input.read_keys = tegra_kbc_check;
Don't you want to set up config.input.* before passing it to input_init(), which might in theory use it?
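i.e. simply move the assignment above the call:

	config.input.read_keys = tegra_kbc_check;
	if (input_init(&config.input, 0)) {
		debug("%s: Cannot set up input\n", __func__);
		return -1;
	}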
diff --git a/include/tegra-kbc.h b/include/tegra-kbc.h
+#define KEY_IS_MODIFIER(key) ((key) >= KEY_FIRST_MODIFIER)
Shouldn't that be in the input layer or the file that defines the key codes?