Commit History

Author          | SHA1       | Message                                                                                              | Date
Nel Nibcord     | 42172b2c39 | Updated unit tests                                                                                   | 1 year ago
Nel Nibcord     | 8b71d57da7 | Removed inference state entirely                                                                     | 1 year ago
cadenmackenzie  | f67e18a797 | adding option to expand error to see stack trace and clearing timeout if message is expanded         | 1 year ago
cadenmackenzie  | ead8e28915 | adding timeout back but making it 30 seconds                                                         | 1 year ago
cadenmackenzie  | 325eddddd5 | modifying error handling to include name and stack trace if available. css support multiple lines    | 1 year ago
cadenmackenzie  | cbe551d119 | removing timeout for error and adding close button                                                   | 1 year ago
Alex Cheema     | 4c98108d3e | increase grpc msg limit                                                                              | 1 year ago
Alex Cheema     | 1946373716 | fix debug log                                                                                        | 1 year ago
Alex Cheema     | 42db2ffbc8 | Merge pull request #440 from blindcrone/debug-fix                                                    | 1 year ago
Nel Nibcord     | f02e62c9c0 | Neglected to backpropagate this debug output fix from my training branch                             | 1 year ago
Alex Cheema     | fb8933547a | Merge pull request #437 from blindcrone/unit-tests                                                   | 1 year ago
Nel Nibcord     | 03924cf9af | Need tokens. Also, for some reason this gets mad if we have non-integral tokens but this isn't a problem elsewhere? | 1 year ago
Alex Cheema     | 854a7c22ac | Merge pull request #436 from blindcrone/unit-tests                                                   | 1 year ago
Nel Nibcord     | e463cd8196 | Ok not sure we're using this but just in case                                                        | 1 year ago
Nel Nibcord     | 7e3ad9abc8 | Missed a spot                                                                                        | 1 year ago
Alex Cheema     | b8b4ea3633 | Merge pull request #435 from blindcrone/unit-tests                                                   | 1 year ago
Nel Nibcord     | 1cd3efbe4c | Fixed unit tests                                                                                     | 1 year ago
Alex Cheema     | b400a442ee | Merge pull request #420 from blindcrone/refactor-inference                                           | 1 year ago
Nel Nibcord     | 65fdc99ccc | Call no longer needs request_id                                                                      | 1 year ago
Nel Nibcord     | 90518a3bbe | Hoisted caching to a wrapper class                                                                   | 1 year ago
Nel Nibcord     | bf33ffde87 | This doesn't need to be a tuple really                                                               | 1 year ago
Nel Nibcord     | 10e9f44a10 | one-line output buffering                                                                            | 1 year ago
Nel Nibcord     | 52ef6ee4a3 | Made temperature and top_p available to the inference engine sample interfaces                       | 1 year ago
Nel Nibcord     | 8205a5aebc | Implemented per-request caching in tinygrad                                                          | 1 year ago
Nel Nibcord     | 13572e6a40 | Some stability improvements for tinygrad inference                                                   | 1 year ago
Nel Nibcord     | aefc0d7c51 | I think this is more faithful to how it was originally done                                          | 1 year ago
Nel Nibcord     | c06b5f3b56 | Corrected type annotations                                                                           | 1 year ago
Nel Nibcord     | 9b66758b59 | Make sure they're np arrays                                                                          | 1 year ago
Nel Nibcord     | b9d0fb6825 | Since infer_prompt is a thin wrapper that works the same for all inference engines, we can de-abstract it | 1 year ago
Nel Nibcord     | 527c7a6e49 | Applied new interface to tinygrad and dummy inference engines                                         | 1 year ago