Breeding better hogs with artificial intelligence

Tech for ultrasound image interpretation, body weight estimation, and testicular ultrasound may help determine productive versus nonproductive boars.

John Hart, Associate Editor

April 24, 2024

As general manager of premium genetics for Smithfield Foods, Kent Gray’s job is to develop the genetic product that goes into the hogs Smithfield brings to market. It’s a complicated job that requires capturing a great deal of data.  

But it can be made easier with artificial intelligence (AI) and machine learning. 

“Genetics is really based on data. We need to understand the data and utilize the data in order for us to be more accurate in identifying those animals that are best parents within the population,” Gray said at an Emerging Research Showcase on generative AI at the North Carolina Biotechnology Center in Research Triangle Park April 2. 

Gray said his department at Smithfield Foods deals with 865,000 sows, which equates to 17 million market hogs that go to all Smithfield packing plants. He said capturing sufficient data to bring the best hogs to market is time consuming, labor intensive, and requires a complex understanding of the animals. 

“The labor is what is causing us the most issues right now. It’s hard to find individuals that are willing to capture this information and also have the know-how to be able to do it,” Gray said at the showcase, sponsored by North Carolina State University College of Agriculture and Life Sciences. 

Weighing AI 

Smithfield Foods is examining AI and machine learning for ultrasound image interpretation, body weight estimation, and testicular ultrasound. Gray says the goal is to determine productive versus nonproductive boars and bring the most productive hogs to market.  

“The first way we actually utilized this technology came just a few years ago, when we realized it was taking a lot of people hours to take images from ultrasound data and interpret them. We need that information in order to select animals before they die. I want to make sure we select the right animal and know what its back fat and loin depth are without having to kill it,” Gray said. 

Gray’s department uses AI-based ultrasound image interpretation: it takes ultrasound images of each animal and trains the program to identify back fat thickness and loin depth by locating the ribs and measuring from there. 
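For illustration only, a minimal sketch of what such a model could look like is below, written in Python with PyTorch. The architecture, image size, and labels are assumptions made for this example; this is not Smithfield’s actual system.

import torch
import torch.nn as nn

class UltrasoundRegressor(nn.Module):
    # Small CNN that predicts two values per ultrasound image:
    # back fat thickness (mm) and loin depth (mm).
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(64, 2)  # outputs: [back fat, loin depth]

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = UltrasoundRegressor()
images = torch.randn(8, 1, 128, 128)        # stand-in for grayscale ultrasound scans
targets = torch.tensor([[14.0, 62.0]] * 8)  # stand-in labels, in millimeters
loss = nn.functional.mse_loss(model(images), targets)
loss.backward()                             # one illustrative training step
print(round(loss.item(), 2))

In practice, a model like this would be trained on images paired with carcass or probe measurements, so the network learns the rib landmarks a technician would otherwise locate by hand.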

Smithfield is collaborating with Virginia Tech to determine how AI and machine learning can be used to estimate the weight of an animal. “Weighing an animal is important. I don’t think we will ever get to be accurate enough to be able to use this in genetics, but we could get the weight of an animal for us to be able to market animals better.” 
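One common way to approach that problem in the research literature is to extract body measurements from camera images and regress weight against them. The toy sketch below, in Python with scikit-learn, shows the idea; the body-length and heart-girth features and every number in it are invented for demonstration, not Smithfield or Virginia Tech data.

import numpy as np
from sklearn.linear_model import LinearRegression

# Hypothetical features extracted from images: body length and heart girth (cm)
X = np.array([[110, 120], [115, 128], [120, 135], [125, 142], [130, 150]])
y = np.array([95, 105, 118, 128, 140])  # scale weights (kg) for the same animals

model = LinearRegression().fit(X, y)
print(model.predict([[122, 138]]))      # estimated weight for a new, unweighed animal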

Gray said it is important for Smithfield to manage the standard deviation of hog weights for the packing plant because a particular animal size is requested or needed there. He said AI and machine learning could help them do a better job of selecting the hogs that provide the most value to Smithfield Foods. 

Ultrasound

Smithfield Foods is also collaborating with Virginia Tech to apply machine learning to testicular ultrasound in order to identify boars that will be more reproductively sound before Smithfield breeds them on the farm. Gray said it takes four months for a bred sow to actually farrow, so without earlier screening Smithfield cannot recognize reproduction problems until then. 
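As a rough illustration of the screening idea, the sketch below trains a simple classifier on ultrasound-derived echotexture features to flag boars as likely productive or not. The features, numbers, and labels are hypothetical and do not represent the Smithfield/Virginia Tech work.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-boar features: mean echo intensity and texture heterogeneity
X = np.array([[98, 0.12], [105, 0.30], [92, 0.10], [110, 0.35], [96, 0.15], [108, 0.28]])
y = np.array([1, 0, 1, 0, 1, 0])  # 1 = productive, 0 = nonproductive (invented labels)

clf = LogisticRegression().fit(X, y)
print(clf.predict_proba([[100, 0.20]]))  # class probabilities [nonproductive, productive] for a new boar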

“The challenges that we face are collecting data in barns. Barns are dirty. We wash them constantly and that does cause some issues with cameras and things like that. We don’t have high speed internet where we go to collect this information,” he explained. 

Gray also sees potential for AI and machine learning at Smithfield’s packing plants. He noted that Smithfield captures a lot of data from pigs it harvests at the packing plant. 

“When we harvest those pigs, they have to have their identification taken out of their ear -- it’s an ear tag. Then we have to follow the animal through the entire process through getting the different cuts out of the animal. I know there is an opportunity to utilize this technology for us to be able to identify animals throughout this process, keep track of them, and be able to better utilize the accuracy of using these models in order for us to be able to select animals,” he said.

About the Author(s)

John Hart

Associate Editor, Southeast Farm Press

John Hart is associate editor of Southeast Farm Press, responsible for coverage in the Carolinas and Virginia. He is based in Raleigh, N.C.

Prior to joining Southeast Farm Press, John was director of news services for the American Farm Bureau Federation in Washington, D.C. He also has experience as an energy journalist. For nine years, John was the owner, editor and publisher of The Rice World, a monthly publication serving the U.S. rice industry. John also worked in public relations for the USA Rice Council in Houston, Texas and the Cotton Board in Memphis, Tenn. He also has experience as a farm and general assignments reporter for the Monroe, La. News-Star.

John is a native of Lake Charles, La. and is a graduate of the LSU School of Journalism in Baton Rouge. At LSU, he served on the staff of The Daily Reveille.
