tech:


Parameter-efficient tuning methods (PETM)

Parameter-efficient tuning methods (PETM) are techniques used in artificial intelligence (AI) and machine learning (ML) to optimize model performance while minimizing the computational resources and time required for tuning. The goal is to strike a balance between achieving high-quality results and using those resources efficiently.

Tuning a model often involves exploring a vast space of hyperparameters to find the optimal configuration. This search can be computationally expensive, especially for complex models and large datasets, because each candidate configuration typically requires a full training run to evaluate. PETM techniques address this challenge by providing strategies that tune models effectively with far fewer evaluations.
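To make the cost concrete, consider an exhaustive grid search, where the number of training runs is the product of the sizes of every hyperparameter axis. The sketch below is purely illustrative: the search space and the `evaluate` function (a stand-in for an expensive training run) are hypothetical, not drawn from any particular library.

```python
import itertools

# Hypothetical search space for three hyperparameters.
search_space = {
    "learning_rate": [1e-4, 1e-3, 1e-2],
    "batch_size": [16, 32, 64, 128],
    "num_layers": [2, 4, 6, 8],
}

def evaluate(config):
    """Stand-in for an expensive training run; returns a toy score."""
    return -abs(config["learning_rate"] - 1e-3) - abs(config["num_layers"] - 4) / 10

# Exhaustive grid search: the number of evaluations is the product of the
# sizes of every axis -- here 3 * 4 * 4 = 48 full training runs.
grid = [dict(zip(search_space, values))
        for values in itertools.product(*search_space.values())]
best = max(grid, key=evaluate)
print(len(grid))  # 48
```

Even this tiny grid requires 48 evaluations; adding one more axis with five values would push it to 240. PETM approaches aim to find comparably good configurations with a fraction of that budget.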

PETM encompasses a range of approaches, including:

  1. Bayesian optimization: This method uses Bayesian inference to maintain a probabilistic model of performance as a function of the hyperparameter configurations observed so far. By leveraging that model, Bayesian optimization selects promising settings to evaluate next, reducing the number of evaluations required.
  2. Genetic algorithms: Inspired by natural evolution, genetic algorithms use a population-based search to optimize models iteratively. By selecting the most promising hyperparameter combinations and applying genetic operators such as crossover (combining parents) and mutation, this method explores the hyperparameter space efficiently.
  3. Model-based optimization: This approach builds a surrogate model, such as a Gaussian process, to approximate the model's performance under different hyperparameter configurations. The surrogate is updated after each observed evaluation and guides the search toward promising regions of the hyperparameter space.
  4. Transfer learning: Transfer learning leverages knowledge gained from tuning similar models or datasets to accelerate tuning of a new model. Transferring insights and learned hyperparameters narrows the search space, allowing more efficient tuning.
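The genetic-algorithm strategy above can be sketched with the standard library alone. Everything here is illustrative, assuming a two-dimensional search space (learning rate and dropout) and a toy `fitness` function standing in for validation accuracy; none of the names come from a real tuning library.

```python
import random

random.seed(0)

# Hypothetical continuous search space: (learning_rate, dropout).
BOUNDS = [(1e-4, 1e-1), (0.0, 0.6)]

def fitness(individual):
    """Stand-in for validation accuracy; peaks near lr=0.01, dropout=0.3."""
    lr, dropout = individual
    return -((lr - 0.01) ** 2) * 1e4 - (dropout - 0.3) ** 2

def random_individual():
    return [random.uniform(lo, hi) for lo, hi in BOUNDS]

def crossover(a, b):
    # Uniform crossover: each gene comes from one parent at random.
    return [random.choice(pair) for pair in zip(a, b)]

def mutate(ind, rate=0.2):
    # Gaussian mutation, clipped back into the search bounds.
    out = []
    for x, (lo, hi) in zip(ind, BOUNDS):
        if random.random() < rate:
            x = min(hi, max(lo, x + random.gauss(0, (hi - lo) * 0.1)))
        out.append(x)
    return out

def evolve(pop_size=20, generations=30):
    population = [random_individual() for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness, reverse=True)
        parents = population[: pop_size // 2]          # selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        population = parents + children                # next generation
    return max(population, key=fitness)

best = evolve()
```

Because the top half of each generation is carried forward unchanged (elitism), the best configuration found never gets worse, and a few hundred cheap evaluations replace an exhaustive sweep.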

The effectiveness of PETM depends on factors such as the problem domain, dataset size, and the complexity of the model. Choosing the most suitable PETM approach requires careful consideration of these factors and understanding the trade-offs between efficiency and performance.



Just in

Reddit hasn’t turned a profit in nearly 20 years, but it just filed to go public anyway — CNN

Reddit — which is not yet profitable — says it seeks to grow its business through advertising, more e-commerce offerings and by licensing its data to other companies to train their artificial intelligence models, write Clare Duffy and John Towfighi.

Leidos awarded $143M Defense Intelligence Agency technology platform contract

Leidos has obtained a task order contract from the Defense Intelligence Agency's (DIA) Science & Technology Directorate. The contract tasks Leidos with the creation and implementation of a comprehensive system for managing open-source intelligence.

Staff say Dell’s return to office mandate is a stealth layoff, especially for women — The Register

The implications of choosing to work remotely, we're told, are: "1) no funding for team onsite meetings, even if a large portion of the team is flying in for the meeting from other Dell locations; 2) no career advancement; 3) no career movements; and 4) remote status will be considered when planning or organization changes – AKA workforce reductions," writes Thomas Claburn. 

Orkes raises $20M

Cupertino, CA-based Orkes, a company focused on the scaling of distributed systems, has raised $20 million.

Motorola Solutions appoints Nicole Anasenes to board

Motorola Solutions announced the appointment of Nicole Anasenes to its board of directors. Ms. Anasenes has over two decades of experience in leadership roles across software and services, market development, acquisitions, and business transformation.