KAN intuition for setting the hidden layers #14

Closed
marcopeix opened this issue May 8, 2024 · 1 comment

@marcopeix

Hello,

Is there a rule of thumb or intuition for setting the layers_hidden parameter? I'm using it for time series, and I pass [input_size, 10, horizon]. The 10 is arbitrary, taken from the MNIST example, but do you have a suggestion for setting these values to get the best performance?
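A minimal sketch of the setup described in the question, assuming the efficient-kan KAN class and its list-of-layer-widths constructor; the import path, lookback length, and horizon are illustrative assumptions, not values from this thread:

```python
import torch
from efficient_kan import KAN  # import path assumed for the efficient-kan package

input_size = 96   # lookback window length (illustrative)
horizon = 24      # forecast length (illustrative)

# Same layer spec as in the question: [input_size, hidden_width, horizon]
model = KAN([input_size, 10, horizon])

# One batch of flattened lookback windows -> forecasts of length `horizon`
x = torch.randn(32, input_size)
y_hat = model(x)  # shape: (32, horizon)
print(y_hat.shape)
```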

@hoangthangta

So far, I know that the number of hidden neurons can affect performance. In the MNIST example, I tried hidden sizes of 128, 256, and 512 instead of 64. However, there is a limit to how much this helps. I'm not sure about the other parameters.

model = KAN([28 * 28, 256, 10])
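A hedged sketch of the kind of comparison mentioned above: building the MNIST-style model with each of the hidden widths the commenter tried and checking how the parameter count grows. The import path is an assumption; the widths 64/128/256/512 come from the comment.

```python
from efficient_kan import KAN  # import path assumed

# 64 is the original MNIST example width; 128, 256, 512 are the widths tried above.
for hidden in (64, 128, 256, 512):
    model = KAN([28 * 28, hidden, 10])
    n_params = sum(p.numel() for p in model.parameters())
    print(f"hidden={hidden:4d}  parameters={n_params:,}")
```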

Repository owner locked and limited conversation to collaborators May 20, 2024
@Blealtan converted this issue into discussion #33 May 20, 2024

This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →
