{"payload":{"feedbackUrl":"https://github.com/orgs/community/discussions/53140","repo":{"id":747527985,"defaultBranch":"main","name":"mlx-examples","ownerLogin":"davidkoski","currentUserCanPush":false,"isFork":true,"isEmpty":false,"createdAt":"2024-01-24T05:29:45.000Z","ownerAvatar":"https://avatars.githubusercontent.com/u/46639364?v=4","public":true,"private":false,"isOrgOwned":false},"refInfo":{"name":"","listCacheKey":"v0:1706642041.0","currentOid":""},"activityList":{"items":[{"before":"297a908e3db205a5d35acf806a30addbb2515788","after":"abcd8918518d071169fd6185b33f474c2f07956e","ref":"refs/heads/main","pushedAt":"2024-04-24T14:36:47.000Z","pushType":"push","commitsCount":28,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"},"commit":{"message":"Add support for phi-3 (#712)\n\n* Add phi-3 modelling\r\n\r\n* fix rope scaling warning\r\n\r\n* add tests and update tuner utils\r\n\r\n* update name and remove sanitize\r\n\r\n* fix lora","shortMessageHtmlLink":"Add support for phi-3 (ml-explore#712)"}},{"before":"e56d9015ef210ef869ab74cf5727a8504efe99ac","after":"297a908e3db205a5d35acf806a30addbb2515788","ref":"refs/heads/main","pushedAt":"2024-03-27T23:45:32.000Z","pushType":"push","commitsCount":29,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"},"commit":{"message":"fix(mlx-lm): type hints in gguf.py (#621)","shortMessageHtmlLink":"fix(mlx-lm): type hints in gguf.py (ml-explore#621)"}},{"before":"a429263905af4044eb3a7d6412e19df54b390a73","after":"e56d9015ef210ef869ab74cf5727a8504efe99ac","ref":"refs/heads/main","pushedAt":"2024-03-12T15:23:30.000Z","pushType":"push","commitsCount":18,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"},"commit":{"message":"LoRA on 
all linear transformer block layers (#546)\n\n* Add --lora-all-linear option to apply LoRA to all linear transformer block layers\r\n\r\n* Moved to YAML config and added specification of rank & alpha\r\n\r\n* nits in config, more tests\r\n\r\n* nit\r\n\r\n* run tests for prs\r\n\r\n---------\r\n\r\nCo-authored-by: Awni Hannun ","shortMessageHtmlLink":"LoRA on all linear transformer block layers (ml-explore#546)"},{"before":"97c09a863daae18ef662ab62b34be97534fcdf23","after":"a429263905af4044eb3a7d6412e19df54b390a73","ref":"refs/heads/main","pushedAt":"2024-03-02T06:25:54.000Z","pushType":"push","commitsCount":16,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"},"commit":{"message":"LlaVA in MLX (#461)\n\n* add: llava mlx first draft\r\n\r\n* add: weights comparison\r\n\r\n* add forward pass skeleton\r\n\r\n* update: now imports weights correctly\r\n\r\n* delete base\r\n\r\n* latest\r\n\r\n* adding config\r\n\r\n* fix: use config\r\n\r\n* add mlx config\r\n\r\n* feat: add image processor for llava processor\r\n\r\n* wip\r\n\r\n* feat: llava working example\r\n\r\n* chore: refactor generate script\r\n\r\n* chore: clean up\r\n\r\n* add: warning to user if no token despite using one\r\n\r\n* add: __call__ to LlavaModel\r\n\r\n* add: call to LlavaModel\r\n\r\n* update fp\r\n\r\n* clean up var names\r\n\r\n* update: native GeLU\r\n\r\n* Cleanup\r\n\r\n* update generate and readme\r\n\r\n* remove todo comment\r\n\r\n* rearrange tests\r\n\r\n* fix example code\r\n\r\n* nits in README\r\n\r\n* update readme\r\n\r\n* nit in readme\r\n\r\n* nits in README\r\n\r\n* chore(llava): refactor image embedding merging logic\r\n\r\n* min mlx version\r\n\r\n* nits in readmes\r\n\r\n* fix cli prompt, some nits\r\n\r\n* updates, slight simplify\r\n\r\n---------\r\n\r\nCo-authored-by: anchen \r\nCo-authored-by: Awni Hannun ","shortMessageHtmlLink":"LlaVA in MLX 
(ml-explore#461)"}},{"before":"614de6652faf7747737af991e34008618598da43","after":"97c09a863daae18ef662ab62b34be97534fcdf23","ref":"refs/heads/main","pushedAt":"2024-02-21T23:30:34.000Z","pushType":"push","commitsCount":44,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"},"commit":{"message":"bump version and include in package (#475)","shortMessageHtmlLink":"bump version and include in package (ml-explore#475)"}},{"before":null,"after":"8de00a18ef4852c6a4e17bbebce9f3bbdd85b866","ref":"refs/heads/mlxlm-tps-fix","pushedAt":"2024-01-30T19:14:01.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"},"commit":{"message":"fix token count computation -- produced wrong tps measurements","shortMessageHtmlLink":"fix token count computation -- produced wrong tps measurements"}},{"before":"8d35fb2c8f9b4eb004964053abe36c8edf489676","after":null,"ref":"refs/heads/mlxlm-tps-fix","pushedAt":"2024-01-30T19:12:14.000Z","pushType":"branch_deletion","commitsCount":0,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"}},{"before":"cad7da4036abb49e6e2391a4d7e31f499753ac2e","after":null,"ref":"refs/heads/mistral-add-safetensors","pushedAt":"2024-01-30T19:12:13.000Z","pushType":"branch_deletion","commitsCount":0,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"}},{"before":"ab91ac1075b3832a508dba2fae151112dd8b4029","after":"614de6652faf7747737af991e34008618598da43","ref":"refs/heads/main","pushedAt":"2024-01-30T19:10:26.000Z","pushType":"push","commitsCount":9,"pusher":{"login":"davidkoski","name":"David 
Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"},"commit":{"message":"chore(mlx-lm): add reset lora layers helper (#377)\n\n* chore(mlx-lm): add reset lora layers helper\r\n\r\n* chore: rename the func\r\n\r\n* chore: update docstring\r\n\r\n* Update llms/mlx_lm/tuner/utils.py\r\n\r\nCo-authored-by: Awni Hannun \r\n\r\n---------\r\n\r\nCo-authored-by: Awni Hannun ","shortMessageHtmlLink":"chore(mlx-lm): add reset lora layers helper (ml-explore#377)"},{"before":null,"after":"8d35fb2c8f9b4eb004964053abe36c8edf489676","ref":"refs/heads/mlxlm-tps-fix","pushedAt":"2024-01-30T19:10:14.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"}},{"before":null,"after":"cad7da4036abb49e6e2391a4d7e31f499753ac2e","ref":"refs/heads/mistral-add-safetensors","pushedAt":"2024-01-24T05:36:06.000Z","pushType":"branch_creation","commitsCount":0,"pusher":{"login":"davidkoski","name":"David Koski","path":"/davidkoski","primaryAvatarUrl":"https://avatars.githubusercontent.com/u/46639364?s=80&v=4"},"commit":{"message":"add safetensors option for mistral weight conversion\n\n- add an option to write and read weights in safetensor format\n- in theory these weights load faster, though in measurement it seems to be less than 50ms difference","shortMessageHtmlLink":"add safetensors option for mistral weight conversion"}}],"hasNextPage":false,"hasPreviousPage":false,"activityType":"all","actor":null,"timePeriod":"all","sort":"DESC","perPage":30,"cursor":"djE6ks8AAAAEOTz3iAA","startCursor":null,"endCursor":null}},"title":"Activity · davidkoski/mlx-examples"}